D7000 sensor benchmark results are in!

cfusionpm has been grinding the DxO Mark axe for months, simply because his favorite camera tests out so poorly (in the mid-60s). But, as a newcomer, you probably do not really understand who you're dealing with, or what their hidden biases are. You sound like a Canon fanboi to me...

chito beach, what Derrel neglects to mention are all the times I've said things like how great the higher-end Nikon cameras are, and how much I wish Canon would release a full-frame speed/sports camera like the D700 and D3/D3s. The only reason I'm with Canon is that it's where I'm invested, and it doesn't really make sense to switch. Right now, I would absolutely love to have a D700/D7000 combo instead of a 7D/50D (or a theoretical 7D/5DII). These "hidden biases" are fabrications Derrel likes to invent so he can fluff off and ignore anything I have to say, even when he agrees with me! :thumbup:

I agree that the overall DxO mark number is meaningless. However, the individual measurements that feed into the overall score (color depth, dynamic range, low-light performance), as well as the detailed plots for each measure, are meaningful and quite useful.
This is also a good point. But as this article points out, it takes a difference of 5 full points to make a difference of 1/3 of a stop.

To quote them:
"But eventually I stopped relying on it because I was finding a growing disconnect between the results that I was seeing from some equipment and the numbers being generated by Optics Pro. Eventually I returned to doing subjective reviews, which I have continued ever since. Nevertheless since then quite a few organizations have adopted the DxO Optics Pro testing system, including Popular Photography magazine in the US and Chasseur d'Image in France.

When the company's DxOMark pages first went online last November I was positive about it because I felt that the engineers and scientists at DxO really know their stuff, and that the industry could use an impartial technical yardstick by which to measure digital camera performance.

Since then though I have become increasingly concerned, because their DxOMark metric provides people with a number, precise to within one decimal point, but which has become misunderstood by many.

Firstly, such a level of precision is essentially meaningless. Statisticians call it spurious precision, since it creates an impression of accuracy that isn't at all relevant.

Does a ranking of 62.3 really differ in any meaningful way from 63.8? No, not at all. In fact DxO points out that a measure smaller than 5 is hardly perceptible, representing just a 1/3rd stop difference. "
So very large differences can be useful in evaluating a camera, but more often than not the differences are so small that the human eye couldn't even tell them apart. Factor in post-production (which is done on pretty much all images, to some degree) and it matters even less. What can be more useful, especially for real world applications, are visual image examples. It's something DxO lacks entirely and has been the point I'd been arguing all along.
 
How much effect does the higher pixel pitch of the D90 have? It has to contribute to the higher score as well.

In general, pixel pitch is a factor. All else being equal, more widely spaced pixels will give better low-light performance.

But it's probably not much of a factor here. The difference in pixel pitch between the D90 and D300s is about 1%, while the difference in low-light performance is close to 25%. So there has to be a bigger factor in play, and my money is on the A-to-D converters.
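For anyone who wants to sanity-check that kind of comparison, here's a rough sketch of how pixel pitch is usually estimated from the spec sheet (sensor width divided by the horizontal pixel count). The sensor widths and pixel counts below are illustrative placeholders, not the actual figures for either body:

```python
def pixel_pitch_um(sensor_width_mm, pixels_across):
    """Approximate pixel pitch in micrometres: sensor width divided by horizontal pixel count."""
    return sensor_width_mm / pixels_across * 1000.0

# Illustrative placeholder specs -- substitute the real spec-sheet values for the bodies you care about.
pitch_a = pixel_pitch_um(23.6, 4288)   # hypothetical camera A
pitch_b = pixel_pitch_um(23.6, 4310)   # hypothetical camera B, slightly more pixels on the same width

diff_pct = abs(pitch_a - pitch_b) / pitch_a * 100
print(f"A: {pitch_a:.2f} um  B: {pitch_b:.2f} um  difference: {diff_pct:.1f}%")
```

Even a pitch difference of a few percent is tiny next to a 25% gap in measured low-light performance, which is why the readout electronics (the A-to-D converters) are the more likely explanation.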

Thanks, that's the answer I've been looking for since the start.
 
This has become another one of those digital vs. analog cases. It'll never end. Just be happy with what you have and let your final product be the judge.
 
What can be more useful, especially for real world applications, are visual image examples. It's something DxO lacks entirely and has been the point I'd been arguing all along.

DxO mark is a benchmark. Nothing more. Nothing less. It's not meant to replace visual evaluations. But it does provide useful information.

In an ideal world, we should compare images from different cameras of the same subject taken by the same photographer under the exact same lighting. For every camera we are interested in. Under all circumstances of interest to us. Is that practical? Not for most of us. So in the non-ideal world we live in, benchmarks can be used to draw inferences on what to expect.

Think of it like the 0-60 time of a car. No one buys a car purely for the purpose of repeatedly going from 0-60 mph and nothing else. But the 0-60 time does give useful information about the car's capabilities and what you could expect in some real-life situations. And it's an easily repeatable experiment that can be compared across different models without subjective influence from the person doing the testing.

If my car takes longer to go from 0-60 than a Ferrari 599, it doesn't mean my car is useless. It also doesn't mean I should take to the streets bashing 0-60 time as a benchmark. Same with the DxO mark. If your camera has a low DxO mark score, think of it as a hint to the manufacturer to improve that aspect the next time around. Instead of bashing the benchmark, you should be demanding that the manufacturer improve so that your next camera can be even better than the one you have and love today.
 
What can be more useful, especially for real world applications, are visual image examples. It's something DxO lacks entirely and has been the point I'd been arguing all along.

DxO mark is a benchmark. Nothing more. Nothing less. It's not meant to replace visual evaluations. But it does provide useful information.

In an ideal world, we should compare images from different cameras of the same subject taken by the same photographer under the exact same lighting. For every camera we are interested in. Under all circumstances of interest to us. Is that practical? Not for most of us. So in the non-ideal world we live in, benchmarks can be used to draw inferences on what to expect.

Think of it like the 0-60 time of a car. No one buys a car purely for the purpose of repeatedly going from 0-60 mph and nothing else. But the 0-60 time does give useful information about the car's capabilities and what you could expect in some real-life situations. And it's an easily repeatable experiment that can be compared across different models without subjective influence from the person doing the testing.

If my car takes longer to go from 0-60 than a Ferrari 599, it doesn't mean my car is useless. It also doesn't mean I should take to the streets bashing 0-60 time as a benchmark. Same with the DxO mark. If your camera has a low DxO mark score, think of it as a hint to the manufacturer to improve that aspect the next time around. Instead of bashing the benchmark, you should be demanding that the manufacturer improve so that your next camera can be even better than the one you have and love today.

well said :thumbup:
 
DxOMark - Sensor rankings

Ranks (scores) of those that are probably of interest here:

8. Nikon D700 (80)
10. Nikon D7000 (80)
11. Canon 5DMKII (79)
28. Nikon D300s (70)
35. Nikon D300 (67)
41. Canon 60D (66)
44. Canon 7D (66)

Nikon D7000 had better DR than the D700, but the D700 won in low light. This resulted in an equal score.

Suck it Canon users :lol:

Please you Canon fans....go BACK to the ORIGINAL POST, quoted above...and then tell me why you're even HERE...seriously Matty, I'd loooove to hear why you are in this thread....and you too chito beach.
Seriously...I pointed out that PENTAX is the actual champion of the APS-C category. Is it that the beloved 7D is in 44th place that stings so much? Or what? You guys are pretty amusing.
 
In an ideal world, we should compare images from different cameras of the same subject taken by the same photographer under the exact same lighting. For every camera we are interested in. Under all circumstances of interest to us. Is that practical? Not for most of us. So in the non-ideal world we live in, benchmarks can be used to draw inferences on what to expect.
If I had the money, I would happily undertake this endeavor. I know The Digital Picture has a fantastic database for lenses, though the bodies are generally whatever he uses (usually the top representative at the time of testing for each sensor type). It looks like he is slowly expanding to cover Nikon things as well.

Think of it like the 0-60 time of a car. No one buys a car purely for the purpose of repeatedly going from 0-60 mph and nothing else. But the 0-60 time does give useful information about the car's capabilities and what you could expect in some real-life situations. And it's an easily repeatable experiment that can be compared across different models without subjective influence from the person doing the testing.
Being a car enthusiast myself, I think a better metaphor would be stats like horsepower, torque, or even curb weight. 0-60 is an application of all the elements of the car, a real-world test with recordable results. Stats like that, on the other hand, are calculated numbers (similar to DxO's calculations of RAW data), giving you only a small piece of what it's like to drive that car.

Saying a car has 240 hp, 150 lb-ft of torque, and weighs 2,800 lbs can give you an idea of how it accelerates, but looking at 0-60, 1/4-mile time + trap speed, slalom speeds, or lap times compared to other cars, taken together, will give you a much better and clearer picture of that car's performance.

My point isn't that I'm bitter about something scoring higher or lower than something else; it's that the system itself isn't really that useful. Just like hp/tq numbers, they are a single element among way too many other variables. Great for pissing matches, advertising, and giving you a rough idea about things, but way too many other factors go into producing an image.

I would also like to say that I appreciate your ability to discuss this in a friendly, spirited manner, and I think we actually agree for the most part. We are just approaching it from different viewpoints.

DxOMark - Sensor rankings

Ranks (scores) of those that are probably of interest here:

8. Nikon D700 (80)
10. Nikon D7000 (80)
11. Canon 5DMKII (79)
28. Nikon D300s (70)
35. Nikon D300 (67)
41. Canon 60D (66)
44. Canon 7D (66)

Nikon D7000 had better DR than the D700, but the D700 won in low light. This resulted in an equal score.

Suck it Canon users :lol:

Please you Canon fans....go BACK to the ORIGINAL POST, quoted above...and then tell me why you're even HERE...seriously Matty, I'd loooove to hear why you are in this thread....and you too chito beach.
Seriously...I pointed out that PENTAX is the actual champion of the APS-C category. Is it that the beloved 7D is in 44th place that stings so much? Or what? You guys are pretty amusing.
Still waiting on this one. ;)
 
For a guy who has been here less than two weeks, I find you quite abrasive, and would say you're not fitting in here too well.

What does time spent here have to do with anything? I find you obnoxious, irritating (I know I'm not alone in those thoughts either!), and arrogant... but apparently my opinion doesn't mean squat because I have been here even less time than he has.

You have 6000+ posts in a little over a year...jeez man, get out and live life a little. Go put all that crap you spill out to use...take some pictures and "wow" us or something. :lol:
 
For what it's worth, I've found his posts to be quite informative and well written - so much so that I've looked back at his previous posts in other threads, where he's contributed a lot of material and knowledge to the site.

Time spent on a site doesn't necessarily make you better than anyone else; however, it does help you understand other members' e-titude and mannerisms. Obviously there's something here that stems back before my time, and yours.

I usually weed through the BS and get what relevant information I can out of threads; this one could be interesting if there were more logical debate and less finger-pointing, however.
 
I agree that the overall DxO mark number is meaningless. However, the individual measurements that feed into the overall score (color depth, dynamic range, low-light performance), as well as the detailed plots for each measure, are meaningful and quite useful.
This is also a good point. But as this article points out, it takes a difference of 5 full points to make a difference of 1/3 of a stop.

Isn't that article referring to the overall number, which I agreed wasn't the useful part of the DxO mark results? Besides, if a 5-point difference implies a 1/3-stop improvement, wouldn't a difference of, say, 14 be worth paying attention to?
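To put rough numbers on that, here's a tiny sketch using the rule of thumb from the quoted article (about 5 points per 1/3 stop). The linear conversion is my own assumption based on that quote, not an official DxO formula:

```python
POINTS_PER_STOP = 15.0  # assumed from the quoted rule of thumb: roughly 5 points per 1/3 stop

def points_to_stops(point_difference):
    """Turn a DxOMark score gap into a very rough f-stop difference."""
    return point_difference / POINTS_PER_STOP

print(points_to_stops(1.5))   # ~0.1 stop -- the 62.3 vs 63.8 example, imperceptible
print(points_to_stops(14))    # ~0.9 stop -- a gap large enough to actually notice
```

By that reading, a 14-point spread is getting close to a full stop, which is exactly the kind of difference worth paying attention to.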

In an ideal world, we should compare images from different cameras of the same subject taken by the same photographer under the exact same lighting. For every camera we are interested in. Under all circumstances of interest to us. Is that practical? Not for most of us. So in the non-ideal world we live in, benchmarks can be used to draw inferences on what to expect.
If I had the money, I would happily undertake this endeavor.

The key operative word here being "if", which is precisely the difference between the ideal world and the real world I was referring to.

Think of it like the 0-60 time of a car. No one buys a car purely for the purpose of repeatedly going from 0-60 mph and nothing else. But the 0-60 time does give useful information about the car's capabilities and what you could expect in some real-life situations. And it's an easily repeatable experiment that can be compared across different models without subjective influence from the person doing the testing.
Being a car enthusiast myself, I think a better metaphor would be stats like horsepower, torque, or even curb weight. 0-60 is an application of all the elements of the car, a real-world test with recordable results. Stats like that, on the other hand, are calculated numbers (similar to DxO's calculations of RAW data), giving you only a small piece of what it's like to drive that car.

Saying a car has 240 hp, 150 lb-ft of torque, and weighs 2,800 lbs can give you an idea of how it accelerates, but looking at 0-60, 1/4-mile time + trap speed, slalom speeds, or lap times compared to other cars, taken together, will give you a much better and clearer picture of that car's performance.

First of all, I wasn't using a metaphor. It was an analogy. :)

Second, my original analogy is exactly the right one. DxO mark scores are measured results based on RAW images, just like the 0-60 or the quarter mile or the slalom. And much like those automotive measurements, the DxO mark is measured in a controlled, repeatable environment.

The power (hp) and torque (lb-ft) ratings, weight, etc. are the specs provided by the manufacturer. In the camera world, these are the number of megapixels, the bits per pixel, or the ISO range printed on the specifications page of your manual.
 
For a guy who has been here less than two weeks, I find you quite abrasive, and would say you're not fitting in here too well.

What does time spent here have to do with anything? I find you obnoxious, irritating (I know I'm not alone in those thoughts either!), and arrogant... but apparently my opinion doesn't mean squat because I have been here even less time than he has.

You have 6000+ posts in a little over a year...jeez man, get out and live life a little. Go put all that crap you spill out to use...take some pictures and "wow" us or something. :lol:

As messages on forums go, they don't get much more uselessly dismal than this one. As abrasiveness goes, in time, you'll find that Derrel is mild in comparison and more often than not, knows what he's talking about. I haven't been here long either, yet I've been here long enough to at least see this. Having 6000 posts equates to being a contributor here. Is there something wrong with that? Why are you here? You on the other hand have contributed that your opinion doesn't mean squat. I believe you. (I know I'm not alone in those thoughts either!)
 
I agree that the overall DxO mark number is meaningless. However, the individual measurements that feed into the overall score (color depth, dynamic range, low-light performance), as well as the detailed plots for each measure, are meaningful and quite useful.

This seems a bit contradictory to me. Just because the DxO mark number is unitless doesn't make it meaningless (e.g. the f-stop). You stated yourself that all of the measurements affecting the overall number have meaning, so wouldn't it make sense that the overall rating itself also has meaning?

Saying a car has 240 hp, 150 lb-ft of torque, and weighs 2,800 lbs can give you an idea of how it accelerates, but looking at 0-60, 1/4-mile time + trap speed, slalom speeds, or lap times compared to other cars, taken together, will give you a much better and clearer picture of that car's performance.

My point isn't that I'm bitter about something scoring higher or lower than something else; it's that the system itself isn't really that useful. Just like hp/tq numbers, they are a single element among way too many other variables. Great for pissing matches, advertising, and giving you a rough idea about things, but way too many other factors go into producing an image.

What really sucks is when you've spent money on a car that, at the time, seemed to have great horsepower, torque, and low weight. Then a year later someone rolls up in a brand-new car that not only has more horsepower and torque and weighs less... but cost them about 30% less to get!

BTW, I wouldn't consider the ability to shoot in lower light and achieve higher DR something to bring to a pissing match or an advertising gimmick; those are truly useful attributes of a camera.
 
I agree that the overall DxO mark number is meaningless. However, the individual measurements that feed into the overall score (color depth, dynamic range, low-light performance), as well as the detailed plots for each measure, are meaningful and quite useful.

This seems a bit contradictory to me. Just because the DxO mark number is unitless doesn't make it meaningless (e.g. the f-stop). You stated yourself that all of the measurements affecting the overall number have meaning, so wouldn't it make sense that the overall rating itself also has meaning?

What I meant by "meaningless" is that it's hard to reason about the overall number. Two cameras that have the same overall score can have different characteristics (e.g. one can have better dynamic range while the other can have better low-light performance).

The components of the score, on the other hand, clearly show the characteristics of each camera. They are also more meaningful from a usage point of view. If you do low-light photography, you may care more about the high ISO measurement. If you do mostly studio stuff, you may care more about color depth etc.
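To make that concrete, here's a toy example (the numbers are made up, not DxO's published figures): two bodies tied on the overall number can still point to different choices once you look at the sub-scores that matter for your kind of shooting.

```python
# Made-up component scores -- NOT real DxOMark data, just two cameras tied on the overall number.
cameras = {
    "camera_a": {"overall": 80, "color_depth": 23.5, "dynamic_range": 13.5, "low_light_iso": 1100},
    "camera_b": {"overall": 80, "color_depth": 23.5, "dynamic_range": 12.0, "low_light_iso": 2300},
}

# A landscape shooter might pick by dynamic range; an event shooter by the low-light ISO score.
best_for_landscape = max(cameras, key=lambda name: cameras[name]["dynamic_range"])
best_for_low_light = max(cameras, key=lambda name: cameras[name]["low_light_iso"])
print(best_for_landscape, best_for_low_light)   # different "winners" despite the identical overall score
```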
 
Second, my original analogy is exactly the right one. DxO mark scores are measured results based on RAW images, just like the 0-60 or the quarter mile or the slalom. And much like those automotive measurements, the DxO mark is measured in a controlled, repeatable environment.

The power (hp) and torque (lb-ft) ratings, weight, etc. are the specs provided by the manufacturer. In the camera world, these are the number of megapixels, the bits per pixel, or the ISO range printed on the specifications page of your manual.
OK, so let's ret-con my original post to assume that the hp and tq curves were pulled off a dyno run (a measured, repeatable test to generate numeric values). Now, my main point is that seeing a power curve is interesting and can be useful in certain specific situations, but in no way is it going to tell you a car's lap time. The only way to get the full picture, IMO, is to run it around a circuit against its rivals and compare times (in addition to 0-60, 1/4 mi/trap, or a plethora of other subjective and objective tests auto reviewers use). Each part is not a sole decider, but rather a piece that plays a role in defining the whole. I think my personal frustration with this kind of rationale comes from hearing arguments like "car X is faster than car Y because it has more hp" which is akin to "camera X takes better pictures than camera Y because it has a higher DxO rank."

So like I said earlier, I think we mostly agree that DxO tests can provide (semi-)useful information but are not the main factor in final image quality. I think where we differ is just how big a part they play. Well, that and analogies. :p
 

The D300s is newer and has a similar sensor - and yet... the sensor rating is lower. That doesn't make a whole lot of sense.

But it seems to be consistent with actual performance. PhotoXopher is not the first one to notice it. If so... Nikon ****ed up, not DxO. Or DxO got it right by accident. Or...

The D300s has the same sensor performance as the D300, which is older than the D90. The D90 has been shown to have better ISO performance than the D300 (not just on DxO)... just barely, though.

Looks like DxOMark.com is off-line (I wonder why :lol: ), but from what I recall, the D300s sensor is a tweaked D300 sensor with somewhat better performance. The sensors are not identical.
 
