FINALLY a good DxOMark rating system review

Do You Believe in DxOMark? | byThom | Thom Hogan
[...] That’s referring to DxOMark’s “overall score” value, which if anyone can clearly explain how they come up with that single number, let alone what it actually means in a pragmatic sense, I’d be appreciative. [...] Quick way to know that it’s faux: what’s the difference you’d see in a camera labeled “70” versus “72”? Right, thought so.

I’ve written before how the digital camera world has basically followed the old high-fidelity world in terms of numbers-oriented marketing and testing. Back in the ’70s and ’80s people went crazy about frequency and power ratings for hi-fi gear. Yet in the long run, people who listened to the products found that slavish devotion to buying better numbers didn’t actually get them “better sound.”

One thing a lot of folk don’t get is that raw tests tend to be demosaic dependent. As I’ve pointed out before, Adobe converters do a better job with little override on Canon raw files than they do on Nikon raw files. [...]

[...] At base ISO, the two are close, sure, but look at what happens as you boost ISO. Indeed, if you’re going to look at DxOMark tests, I suggest that you always pull up the “Measurements” tab and look closely at what’s going on there. [...]

It’s interesting to note that DxO seems to be playing a lot of angles. First, they are presenting themselves as impartial, numeric oriented testers (e.g. the scores). Second, they are presenting themselves as reviewers (e.g. "If Canon could only address performance at base and low ISO, the EOS 7D Mk II would make a thoroughly convincing all-round choice, but in this category the Sony A77 II looks to be the more compelling option."). Third, they sell their test equipment and software test suites to camera companies (Nikon, for instance, but I don’t believe Canon is one of their clients). Fourth, they present themselves as the best demosaic option, better than the camera makers’ options (e.g., DxO Optics Pro). They have some clear conflicts of interests that are not easily resolved. So be careful of just gobbling up their “results” as absolutes.

[...]

So, for example, the 7DII has a landscape dynamic range of 11.8EV, the D7100 a range of 13.7EV. Before moving on, I should note that these aren’t exactly what you’d get out of the camera in images (e.g. 12 stops of data on the 7DII, 14 on the D7100). DxOMark’s numbers are closer to engineering DR than usable dynamic range, even with their adjustments for “print” and “screen." Engineering DR is measured between the lowest value that hits a signal-to-noise ratio of 1:1 (which we’d never use) and the highest saturation value of the sensor. Pragmatic DR has no agreed upon definition. When I report dynamic range numbers, as I do in my books, these are based upon my own personal standards, which have some fairly strict observed guidelines towards visibility of noise of any kind.
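(Aside, not part of the quoted article: a minimal sketch of the "engineering DR" idea described above, using made-up sensor numbers rather than anything DxOMark publishes. It uses the common simplification that the noise floor at SNR 1:1 is set by read noise.)

```python
import math

def engineering_dr_stops(full_well_e, read_noise_e):
    """Engineering dynamic range in stops (EV): the ratio between the sensor's
    saturation level and the lowest signal with a signal-to-noise ratio of 1:1.
    In the read-noise-limited approximation, that lowest signal ~= the read noise."""
    return math.log2(full_well_e / read_noise_e)

# Hypothetical values for illustration only (in electrons):
print(engineering_dr_stops(full_well_e=45000, read_noise_e=3.0))  # ~13.9 EV
```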

Even though those DxO-reported numbers aren’t what I’d expect in my images, they do tell me something: I might be able to dig more shadow detail out of the D7100 image than the 7DII image at base ISO, all else equal. That’s actually no surprise. This has been true of the Sony versus Canon sensor difference for quite some time now. If you’re a big fan of cranking the Lightroom Shadow slider to max, you’re going to be doing more noise correction on a Canon than a Nikon. Nothing’s changed here. However I’d point out that neither camera is likely to be able to hold extreme landscape situations in a single exposure: I’d still be bracketing and applying HDR type techniques to construct images in such cases. [...]
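(Another aside, not from the article: the "bracket and merge" idea Hogan mentions can be sketched as a toy like this. It is not how Lightroom or any real HDR tool works, and the frame values and exposure times are invented for illustration.)

```python
import numpy as np

def merge_brackets(frames, exposure_times, clip=0.98):
    """Toy HDR merge: divide each linear frame by its exposure time to get a
    common radiance estimate, then average, trusting only unclipped pixels."""
    frames = [np.asarray(f, dtype=np.float64) for f in frames]
    radiance = np.zeros_like(frames[0])
    weight = np.zeros_like(frames[0])
    for f, t in zip(frames, exposure_times):
        w = (f < clip).astype(np.float64)   # ignore pixels that are nearly clipped
        radiance += w * (f / t)             # per-frame radiance estimate
        weight += w
    return radiance / np.maximum(weight, 1e-9)

# Hypothetical 2-stop bracket of the same scene (linear values in [0, 1]):
dark = np.array([[0.01, 0.20], [0.60, 0.90]])     # 1/200 s
bright = np.array([[0.04, 0.80], [0.99, 0.99]])   # 1/50 s, highlights clipped
print(merge_brackets([dark, bright], exposure_times=[1/200, 1/50]))
# The longer exposure fills in the shadows; clipped highlights fall back to the short one.
```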

So I guess the point I’m trying to make here is to understand why it is that you’re even looking at test results such as DxOMark. You can certainly draw some conclusions from their results. But not “92 is better than 76” or any other Overall Score mark. You have to look at something specific, e.g. dynamic range, and understand why that might be important to you.
Now this is a very nice review of the DxOMark rating system, and it gave me some pieces of information I wasn't aware of before. I've always wanted more in-depth commentary on DxOMark because, frankly, I've always felt these numbers are quite questionable.

The total number I've always ignored anyway, because, as Thom said: I have no clue what it even means. So, the Canon 7D Mk II has the same total number as the Nikon D300s? Riiight. And what does that even mean?

The individual numbers are much better, but they can still be misleading. For example, the D4s reportedly runs into some banding problems at really high ISOs but has less noise overall than the Sony A7s, which has no banding problem. How exactly is that reflected in the DxOMark rating? Not at all. Or take the Leica M9 vs. the M Type 240 - there was a really interesting article comparing those two on Steve Huff's site recently:
In practice, shooting in low light was nearly as limited for the M240 as it was for the M9, which has a practical ISO limit of around 640, after which banding behaviors are the norm with image adjustment.
So yeah, again the DxOMark rating is misleading.

And about the part with 'adjustments for “print” and “screen”' - as a hobbyist, the question "can the output medium display it?" never carried much weight for me. Frankly, I just want as much quality as possible for any future medium, not just for today's. I think of my images more as statements for eternity than as "use now, discard later" entities, so I'd rather capture moments now, in the best quality already possible. For much the same reason I still keep an eye on companies like Leica, hoping for a camera I can buy once and then ignore the camera market for a decade, or until the camera breaks. Though technically my Nikon is probably already pretty close to that level anyway. Either way, ergonomics and build quality matter to me, so my rating of a company like Leica in terms of bang for the buck might be a bit different from other people's.


P.S.: This site's tendency to "shorten" quotes is really annoying, as is the "invisible link" policy.
 
Wow, mark the date and time, because I actually agree with Thom!

All "Big rubber stamp" numbers and grades are worthless. I blame Siskel & Ebert for their "thumbs up and down" approach to movie reviewing.
 
Yeah, I hate DxOMark too. When I want to see some objective measurements of cameras and sensors, I use one of the many other, much better places. Err. Wait....

All measurements are useless unless you understand what they're measuring. But if you don't measure something, then all you've got is audiophile-style subjective nonsense, which is much worse than useless.
 
I don't like DxOMark ratings too much. A 1D2 I had used produced very nice files, yet DxO has it way down their list. From Thom's link, the D7100 does a little better than the 7D2 at base ISO, and they equalize at higher ISO. That's a very good result for Nikon, and not too bad for Canon, as it's better than the previous generation's tech.
 
The thing with "scientific" studies is that the data collection is the easy part. Interpreting the data is where it gets tricky and where bias can creep in.
 
A well-written article describing how DxO Mark tests sensors is here, on The Luminous Landscape web site: DxOMark Camera Sensor

This article about DxO Mark has some really good graphs that show the performance of sensors over time (a decade), and also explains what some of the tests measure, and what the results mean.

Earlier article explaining the DxO Mark testing procedures: DxOMark Sensor For Benchmarking Cameras

As Hogan's article says, be wary of assigning a lot of weight to the "Overall Score", especially if you have no clue about the other measurements. It's interesting that Hogan tries to discredit DxO Mark numbers, but then, in the same article, uses their data to back up and verify empirical observations that hundreds of thousands of users, himself included, have made. He bashes them on one hand, then goes on to use the same testing procedures and their data to advance several of his points of view. He basically validates the different types of DxO Mark measurements, but does warn about the "Overall Score" being looked at in isolation. Yeah...no $hi+ Thom...that's why there are multiple metrics tested, over the entire ISO range of the cameras under test.
 
Regardless of the scores, the fact still remains that Canon's consumer sensors at this time are inferior to Sony's. Maybe not as inferior as DxO would like us to believe, but inferior nonetheless.

I don't understand what he's trying to prove, other than "use common sense and take these numbers with a grain of salt."
 
The thing is that without measuring SOMETHING there's no way to compare things.

Sure, it would be great to also have an essay about the sensor, and those exist on the web in great thundering herds. But if you want to compare one sensor to another, subjective evaluation won't do it. You could have Scientist Bob look at photos from both and say 'well, the XYZ sensor exhibits more banding at high ISOs than sensor ABC,' but then, if Bob is going to help us sort through 100 different sensors, he's going to have to look at, what is it, 4,950 different pairs of sensors and give us his subjective evaluation of each pairing.
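(For what it's worth, that pair count checks out: 100 sensors give 100-choose-2 distinct pairings.)

```python
import math

print(math.comb(100, 2))  # 4950 pairwise comparisons for 100 sensors
print(100 * 99 // 2)      # same count, written out
```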

Alternatively, you could devise a metric for high-ISO banding, assign a numerical value to each sensor, and then compare the numbers.
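(As a rough illustration only, here is one plausible way such a banding metric could be scored from a dark frame. This is not DxOMark's method, and the synthetic data is made up.)

```python
import numpy as np

def banding_score(frame):
    """Toy horizontal-banding metric: how much of the frame's noise
    sits in row-to-row offsets rather than in per-pixel noise.
    Larger values mean stronger row-pattern (banding) noise."""
    frame = np.asarray(frame, dtype=np.float64)
    row_means = frame.mean(axis=1)                      # average of each row
    row_pattern = row_means.std()                       # spread between rows
    pixel_noise = (frame - row_means[:, None]).std()    # noise left after removing row offsets
    return row_pattern / max(pixel_noise, 1e-12)

# Synthetic dark frames: pure random noise vs. noise plus an offset on every other row
rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=(100, 100))
banded = clean + np.tile([[0.0], [0.5]], (50, 1))
print(banding_score(clean))   # near the no-banding baseline
print(banding_score(banded))  # noticeably higher
```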

And the number would be "wrong" some of the time, according to some people's opinions and observations of some sensors under some conditions.

That's the way it is. Numerical measurements have their place, and it's certainly important to know what that place is if you're planning to use them, but they DO have a place.

Luckily all modern sensors are awesome so unless you have a pretty specific application in mind, none of it really matters. Everything is good enough. Everything has limitations, which you'll want to either avoid or learn to work with.
 
Luckily all modern sensors are awesome so unless you have a pretty specific application in mind, none of it really matters. Everything is good enough. Everything has limitations, which you'll want to either avoid or learn to work with.

Pretty much this.

Any camera over $1,500 is pretty damn good and would do the job very well 99.9% of the time. I think we (the internet) have a tendency to nitpick the hell out of every little thing to the point where people get a negative view of anything but the latest and greatest.

Fact of the matter is, I can go on Flickr, look at the popular photos, and not be able to guess what brand/model was used for any of the images, because the differences are so damn slight. More to the point, I've seen amazing photos taken with a 10-year-old camera and horrible photos taken with the newest.
 
A well-written article describing how DxO Mark tests sensors is here, on The Luminous Landscape web site: DxOMark Camera Sensor

This article about DxO Mark has some really good graphs that show the performance of sensors over time (a decade), and also explains what some of the tests measure, and what the results mean.

Earlier article explaining the DxO Mark testing procedures: DxOMark Sensor For Benchmarking Cameras

As Hogan's article says, be wary of assigning a lot of weight to the "Overall Score", especially if you have no clue about the other measurements. It's interesting that Hogan tries to discredit DxO Mark numbers, but then, in the same article, uses their data to back up and verify empirical observations that hundreds of thousands of users, himself included, have made. He bashes them on one hand, then goes on to use the same testing procedures and their data to advance several of his points of view. He basically validates the different types of DxO Mark measurements, but does warn about the "Overall Score" being looked at in isolation. Yeah...no $hi+ Thom...that's why there are multiple metrics tested, over the entire ISO range of the cameras under test.

Thom just wants to sell more of his books :D
 
I think some of you guys forget one very important thing: DXO is the only source we have for judging sensors.
I read a ton of reviews written by people whose actual knowledge, skill, and agenda I don't know.
It's easy to bash DXO, but the truth is that we all need it.
I wouldn't base a purchase just on what DXO says, but I would definitely use them as one reference (out of many) when I decide what to go for.
I find there is a lot of very good and useful information on this site and I will keep using it as one important reference.
 
I don't need it, so it's not the truth. It's the internet, so we pick and choose what we want to bash - always easy.
 
BLASPHEMY :icon_pissedoff::madass::popcorn: Who am I supposed to believe now? :miserable: I have nowhere and no one to :Hail.sml: :BangHead::BangHead:
Just email and ask me.
I'll give you an unbiased, uneducated FINAL statistic on which camera(s) I think are the BEST and that you SHOULD blindly buy on my recommendation.
 
