Need a GPU for Lightroom 5.7 photo editing on a 27 incher monitor and 4 year old rig

Philmar

Impulsively, late one night and in a sleepy mindset, I went online and pulled the trigger on a BenQ SW2700PT.
BenQ SW2700PT 27 Inch Adobe RGB Color Management Monitor for Photographers | BenQ Canada
When I awoke this morning I realized that I'd neglected to consider whether my PC can handle it. My expertise is in pixels, not bits and bytes.

I don't have a graphics card - I use the onboard Intel® Z77 chipset graphics: Intel HD 4000 on my Ivy Bridge i7 3770K
My mobo is the ASUS P8Z77-V PREMIUM (external link).
https://www.asus.com/ca-en/Motherboards/P8Z77V_PREMIUM/...
From what I've subsequently read the new monitor's increased resolution and resultant complex calculations will tax my CPU. I'm wondering (hoping) that a dedicated GPU will help.

I do NOT game or edit video. I do only light Netflix viewing, but the biggest demands on my system come from RAW photo editing. Moving pixels around and displaying them on a high-res monitor involves a lot of CPU calculations.
Will a GPU help my pc performance?
My understanding is that Lightroom 5.7 does not make use of GPU acceleration. The newer LR 6.0 does, and this may force an upgrade for me.
I was hoping for a complete new rebuild in 2 or 3 years. I hope it isn't required as a result of the new monitor purchase.
Any recommendations on the best value (bang for buck) GPU that could help me (and ideally be used in my next build)?
I run W10 64 bit, 32 GB RAM, 1 SSD and 2 spin HDs and Antec 620 W PSU.
Sincerest thanks for any helpful suggestions!
 
Thanks.
That is a value choice?
 
Considering your avatar is that of a hobo, I guess I was taken aback by that expensive suggestion. :1219:
 
I realize the following site is a commercial site devoted to selling you hardware but their advice is interesting.
Recommended System: Recommended Systems for Adobe Lightroom


Video Card (GPU)

In Lightroom CC 2015 and Lightroom 6, the software is able to utilize the power of your GPU to improve performance when editing images in the Develop module. At the moment, the performance gains are fairly modest, although Adobe has been investing heavily in GPU acceleration. While a high-end GPU is not required to get the benefits of GPU acceleration in Lightroom, it may be a good idea to get a slightly faster GPU than you think you need to help future proof your system.


Lightroom is also very light on VRAM requirements, so even a card with just 2GB of VRAM should be more than enough. However, if you work with large images in Photoshop or use a 4K monitor it is a good idea to use a card that has at least 4GB of VRAM if possible. Workstation video cards are not required for Lightroom, although if you will be using a 30-bit monitor you will need a NVIDIA Quadro video card as GeForce cards currently do not support 30-bit display output.


Although it is likely that Adobe will increase GPU acceleration support in Lightroom in the future, the current demand on the video card is actually relatively light. We recommend either a GeForce GTX 1060 or a GeForce GTX 1070.


Is my BenQ SW2700PT a 14 bit monitor?
 
An Nvidia GTX 950 with 2GB VRAM would be more than adequate, and it's not too expensive either. You will need to be a bit careful and check whether your current power supply can handle everything in your rig plus a new gfx card, though; there are plenty of online calculators to help with this.

I'd wait until you get the monitor, though; for 2D graphics you may well find that the onboard gfx is enough. Lightroom is heavily CPU dependent and barely uses the gfx card.
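As a rough illustration of the arithmetic those online PSU calculators do (the wattage figures below are hypothetical ballpark estimates, not measured values for this rig, though the CPU and GPU numbers match the vendors' rated TDPs):

```python
# Back-of-envelope PSU headroom check. All wattages are rough estimates.
components = {
    "CPU (i7-3770K, stock)": 77,   # Intel's rated TDP
    "GPU (e.g. GTX 950)": 90,      # NVIDIA's rated TDP
    "Motherboard + RAM": 50,
    "SSD + 2 spin HDDs": 25,
    "Fans, USB, misc.": 30,
}

total = sum(components.values())
psu_watts = 620  # Antec 620 W PSU from the original post
headroom = psu_watts - total
print(f"Estimated draw: {total} W; headroom on a {psu_watts} W PSU: {headroom} W")
```

By this (very rough) estimate a 620 W supply has plenty of headroom for a mid-range card, but a proper calculator or the card maker's recommended-PSU figure is the safer check.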
 
No one knows what your definition of a "value choice" is unless you provide an actual budget dollar amount.

For instance, a value choice car could be a luxury car without all the options. To someone else it's a non-luxury car with all the options, or maybe a non-luxury car with no options. Who knows unless you tell us ...

Some people here have $50,000 cameras, others $300. So depending upon who you ask you'll get different answers on "value".
 
Is my BenQ SW2700PT a 14 bit monitor?
No.

The specs for that display say:
Display Colors: 1.07 B
You have a 10-bit display.

10 bits can code 1024 colors in each color channel.
There are 3 color channels - red, green, and blue.
1024 x 1024 x 1024 = 1.07 B

A 14-bit display would be trillions of colors, with 16,384 colors per color channel. (4.4 T)
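The arithmetic above can be checked in a couple of lines (the function name here is just for illustration):

```python
# Colors representable at a given bit depth per channel (R, G, B).
def total_colors(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel   # e.g. 10 bits -> 1024 levels per channel
    return levels ** 3               # three channels: R x G x B

print(total_colors(10))  # 1073741824  (~1.07 billion, matching the spec sheet)
print(total_colors(14))  # 4398046511104  (~4.4 trillion)
```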
 
Thanks Keith for your helpful post
 
Freddie the Freeloader:
Freddie the Freeloader was the hobo who lived in the city dump. Freddie would also be seen sleeping on a park bench, often being told by a passing police officer to move along or risk being arrested for vagrancy.

Red Skelton Characters, Pigeon Forge, Clem Kadiddlehopper, Freddie Freeloader
No, Freddie the Freeloader was a character created by RED SKELTON. Psssst, Santa Claus doesn't exist either. :biglaugh:
And the character is a hobo...
 
My mobo is the ASUS P8Z77-V PREMIUM which has one DisplayPort and one Thunderbolt port. My BenQ has a DP port.
Curiously, the monitor does NOT ship with a DP-to-DP cable. However, it DOES ship with a Mini DP-to-DP cable. Someone told me I could run the monitor with the Mini DP end of the supplied cable plugged into the mobo's Thunderbolt port and the DP end plugged into the monitor. Is Mini DP compatible with Thunderbolt? Even if the supplied cable does work, would my monitor run better with a DP-to-DP cable?
 
