Building an editing computer

You could just buy an HP or Dell workstation-class system, add a second monitor, and you will probably be happy with that. The information below is for a very large and expensive dedicated system that can handle photo and video post-processing when time is more important than money.


CPU: Go for the i7 975 if you can afford it. Two words: Hyper-Threading. Photoshop benefits from Hyper-Threading in the i7 processor; combine that with the higher clock speed of the 975 and you have a nice mix. The i5 does not have Hyper-Threading.

RAM: Get as much RAM as you can afford. 8 GB is enough, but if your motherboard supports it, get 16 or 32. 64-bit Photoshop CS4 can use an effectively unlimited amount of RAM, which means less drive caching and translates to faster performance. How big are your files? Are you going to run Lightroom, Bridge, and Photoshop at the same time? Never skimp on RAM, and remember you must run a 64-bit operating system to address more than about 3.5 GB of RAM.
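If you want to sanity-check what your install can actually see before buying more sticks, a minimal sketch like the one below works on Windows using only Python's standard library (GlobalMemoryStatusEx is the real Win32 call for physical memory; the PROGRAMFILES(X86) check is just a common trick for detecting a 64-bit OS):

```python
# Windows-only sanity check: is the OS 64-bit, and how much RAM can it see?
# Standard library only; GlobalMemoryStatusEx reports physical memory, and the
# PROGRAMFILES(X86) environment variable is only set on 64-bit Windows.
import ctypes
import os


class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", ctypes.c_ulong),
        ("dwMemoryLoad", ctypes.c_ulong),
        ("ullTotalPhys", ctypes.c_ulonglong),
        ("ullAvailPhys", ctypes.c_ulonglong),
        ("ullTotalPageFile", ctypes.c_ulonglong),
        ("ullAvailPageFile", ctypes.c_ulonglong),
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]


status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

total_gb = status.ullTotalPhys / (1024 ** 3)
is_64bit_os = "PROGRAMFILES(X86)" in os.environ  # only present on 64-bit Windows

print(f"OS: {'64-bit' if is_64bit_os else '32-bit (limited to ~3.5 GB usable RAM)'}")
print(f"Physical RAM visible to the OS: {total_gb:.1f} GB")
```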

VIDEO: Get any card that supports Shader Model 3.0/OpenGL 2.0/DX10 with at least 256 MB of VRAM; I would recommend 512 MB. Also make sure the card supports PCI-E x16 bus speeds. This is a motherboard consideration as well.

DRIVES AND STORAGE: In this case we want speed for the OS and scratch disks, which dictates using RAID 0. RAID 0 arrays, sometimes called stripe sets, give you no redundancy: a single disk failure destroys the entire array, because when data is written to a RAID 0 array it is broken into fragments. The number of fragments is dictated by the number of disks in the array, and the fragments are written to their respective disks simultaneously on the same sector. This allows smaller sections of the entire chunk of data to be read off the drives in parallel, increasing bandwidth.
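To picture what striping looks like, here's a toy sketch in Python (the 64 KB stripe size and the in-memory lists just stand in for a real controller and real disks):

```python
# Toy illustration of RAID 0 striping: data is chopped into fixed-size chunks
# and dealt round-robin across the member disks, so reads and writes hit all
# spindles in parallel. Lose any one disk and every file is missing pieces.

STRIPE_SIZE = 64 * 1024  # bytes written to one disk before moving to the next
NUM_DISKS = 2


def stripe(data: bytes, num_disks: int = NUM_DISKS, stripe_size: int = STRIPE_SIZE):
    """Split data into stripe-sized fragments and distribute them round-robin."""
    disks = [[] for _ in range(num_disks)]
    chunks = [data[i:i + stripe_size] for i in range(0, len(data), stripe_size)]
    for index, chunk in enumerate(chunks):
        disks[index % num_disks].append(chunk)
    return disks


def reassemble(disks):
    """Read the fragments back in round-robin order to rebuild the original data."""
    out = []
    for position in range(max(len(d) for d in disks)):
        for disk in disks:
            if position < len(disk):
                out.append(disk[position])
    return b"".join(out)


payload = bytes(range(256)) * 2048        # ~512 KB of sample data
disks = stripe(payload)
assert reassemble(disks) == payload       # only works while every disk is present
```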

That being said, for the OS or "C" drive use 2 or more drives (10,000 RPM, 15,000 RPM SAS, or solid state SATA) in a RAID 0. Photoshop also needs a scratch disk to store working files. This should not be on the OS drive or on the drive holding the system swap file.
For the scratch disk you want a minimum of 1 drive, better with 2; repeat the configuration you used for your OS drive. Now for your storage drive you need to use RAID 6. With RAID 6 you can lose 2 drives without losing any data. This is an improvement over traditional RAID 5 because RAID 6 uses a second parity block. As far as your backup goes, if you are a home user buy an external USB or eSATA drive and connect it to back up your pictures and database files. You may also want to think about an online backup solution, or even using Windows Home Server to automatically back up your system on a daily basis.
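
To make the parity idea concrete, here's a toy sketch of the single-parity (RAID 5 style) case in Python; RAID 6's second parity is computed differently (Reed-Solomon rather than plain XOR), but the recovery principle is the same:

```python
# Toy illustration of how parity lets an array survive a dead drive. RAID 5
# keeps one XOR parity block per stripe, so any single missing block can be
# rebuilt from the survivors. RAID 6 adds a second, independently computed
# parity, which is what lets it ride out two failures.

def xor_blocks(blocks):
    """XOR a list of equal-length byte blocks together."""
    result = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            result[i] ^= b
    return bytes(result)


# One stripe spread over three data drives plus one parity drive.
data_blocks = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_blocks(data_blocks)

# Drive 1 dies: rebuild its block from the surviving blocks plus parity.
surviving = [data_blocks[0], data_blocks[2], parity]
rebuilt = xor_blocks(surviving)
assert rebuilt == data_blocks[1]
```
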
Thanks for this info. I see I need to do a bit more research on this and learn a bit more.
 
I have one problem with HDD docks (and I will say right now I haven't used them and haven't come across the problem myself).

These docks are born of a "design restrictions be damned" mentality. They provide functionality that SATA was never designed for and that eSATA was designed to address: namely, the insertion rating of the SATA connector. The connectors on motherboards and hard disks were only ever rated for 50 insertions, after which there is no guarantee that the incredibly thin contacts haven't worn away from the connector surface. This kind of damage is one of the things the far more rugged eSATA connector addresses (rated for 5,000+ insertions).

You may notice that a lot of dedicated solutions to this problem specifically state the insertion rating of the device. I highly recommend you look into a proper external hard disk case with hot-swap caddies. The drive is inserted once into the caddy, and the caddy has connectors on the back with very different ratings, designed for constant removal and reconnection.

It's not an isolated problem. A while ago I saw a thread on WD's community forums about a rejected warranty claim, with a link to the WD website explaining that a SATA connector is only designed for single-connection applications.

This is your data, so do yourself the favour of spending the few dollars extra.

Interesting. I had not heard of this, but it makes sense.
More research I need to do.
 
Don't forget a good, or at least decent, IPS display; there's no sense in spending all that money on a computer if you can't see the results accurately. I would go for multiple displays myself. I seem to be much more productive viewing different info on each screen.

As a side note:
Does anyone know if Lightroom will work with more than 2 monitors? I would love to have one screen for the main view, one Loupe screen for landscape-oriented photos, and one screen in portrait orientation. Can you view two Loupe windows at once? Can it be configured to display images on correctly oriented screens automatically?
Anyway, I guess I could email Adobe.


I already have an IPS monitor and a Spyder.
 
Also, is RAID 5 still in? That's striping with parity as long as you have 3 identical hard drives... it gives redundancy similar to RAID 1+0, which uses 4 hard drives.

Maybe invest in large SSDs striped for super speed, then three 1-terabyte drives in RAID 5, so you have the speed plus redundancy...

lol... it's just a thought, I have never worked with RAID.

RAID 5 is definitely not in anymore. I haven't seen anyone successfully recover a degraded RAID 5 array in the last 2 years, and a few of my friends have lost their entire arrays. The problem is the recovery process: when the dead drive is replaced, the remaining drives are thrashed for a full day or more to rebuild from the degraded state. Drives of similar age, with similar mean times to failure, tend to fall over during this thrashing. RAID 6 addresses this by providing a second parity block, but at that point you may as well go RAID 10, where only the failed drive's mirror partner has to be read during a rebuild.
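A back-of-the-envelope number, assuming typical consumer spec-sheet values rather than anything measured (an unrecoverable read error rate of 1 in 10^14 bits, four 1 TB drives), shows why rebuilds fall over so often:

```python
# Back-of-the-envelope on why RAID 5 rebuilds are scary. Assumed spec-sheet
# numbers: a URE rate of 1 in 1e14 bits and four 1 TB drives, so a rebuild
# has to read the three surviving drives end to end.
URE_RATE = 1e-14        # probability of an unrecoverable read error per bit (assumed)
DRIVE_TB = 1.0          # capacity per drive, in TB (assumed)
SURVIVING_DRIVES = 3    # drives that must be read in full during the rebuild

bits_read = SURVIVING_DRIVES * DRIVE_TB * 1e12 * 8
p_clean_rebuild = (1 - URE_RATE) ** bits_read
print(f"Chance of at least one URE during rebuild: {1 - p_clean_rebuild:.0%}")
# ~21% with these numbers -- and that's before counting a second whole-drive
# failure caused by a day of sustained thrashing on age-matched disks.
```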


Just another thing I forgot to mention: look at motherboards with features like Intel Matrix RAID controllers, which offer partition-level RAID support. I do something like this on my Linux NAS box with software RAID: the OS is disposable and runs on a RAID 0 partition for speed, and the storage partition is RAID 1, so if I need the data quickly I can just pull out a good drive and stick it in another computer.

A RAID array of SSDs would be nice, but damn expensive.
 
