Most of you know that we're running an Ethernet-based SAN here at BCM. It's the Maxx Digital Final Share SAN, which combines their drive arrays with Small Tree Ethernet wizardry and some stout ATTO host bus adapters. We generally get around 100-120MB/s to each of the 7 workstations connected to the SAN. That's more than enough speed to cut with Apple's ProRes codec all day long.
At the moment we're cutting two feature documentaries (over 300 hours of material), three PBS series, and a multitude of other projects, all on the SAN simultaneously. It's been a very solid performer, and when we need to do Uncompressed or 2K work, we have two local 8TB RAIDs directly connected to two workstations, giving us 500-650MB/s. So for our needs, we're set up to handle whatever comes in the door, and we allocate the uncompressed workstations as necessary.
Well, at NAB 2010, Bob Zelin brought me over to the Maxx Digital booth to show me 350MB/s coming off a single 8TB RAID connected via Ethernet.
Now we're getting into Uncompressed HD territory. It's only a single stream of Uncompressed HD, but that's perfectly fine for color grading, sound mixing, and even editing. Not to mention serious multi-stream ProRes capabilities. Via Ethernet!
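For anyone who wants to check my math, here's a quick back-of-the-envelope sketch in Python. I'm assuming v210-style 10-bit 4:2:2 packing (6 pixels stored in 16 bytes), which is the usual figure for uncompressed HD capture; these are video-only approximations, and real captures add audio and file overhead:

```python
# Back-of-the-envelope data rates for uncompressed 10-bit 4:2:2 HD,
# assuming v210-style packing (6 pixels stored in 16 bytes). These are
# video-only approximations; real captures add audio and file overhead.

def uncompressed_rate_mb(width, height, fps):
    """Approximate uncompressed 10-bit 4:2:2 video rate in decimal MB/s."""
    bytes_per_frame = width * height * 16 / 6
    return bytes_per_frame * fps / 1_000_000

rate_1080i = uncompressed_rate_mb(1920, 1080, 29.97)  # ~166 MB/s
rate_720p = uncompressed_rate_mb(1280, 720, 59.94)    # ~147 MB/s

for label, rate in [("1080i29.97", rate_1080i), ("720p59.94", rate_720p)]:
    print(f"{label}: ~{rate:.0f} MB/s per stream, "
          f"~{350 - rate:.0f} MB/s left over at 350 MB/s")
```

By this math a single 10-bit 1080i stream eats roughly half the 350MB/s pipe, which leaves comfortable headroom for that one Uncompressed HD stream plus plenty of ProRes traffic on the side.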
I knew this technology would get better as we moved along, but I didn't expect a 250MB/s jump in just one year. In fact, the system could go faster, but we need to wait for the drive manufacturers to catch up! Can you say multi-stream Uncompressed HD and 2K via Ethernet on the horizon? All I can say is wow.
Really looking forward to working with these new speeds once everything is ready for shipping. I'll update you all as details become available.
We've added a second 16TB EVO2 HD unit from Maxx Digital to our machine room, bringing the total storage to around 60TB for the facility. The unit simply daisy-chains onto the original, making installation incredibly easy. Bob Zelin helped ensure the RAID was set up correctly, and in less than 10 minutes the new RAID was building.
We continue to add more storage in support of the multiple long-form documentary projects currently in production. Our next step will be to either upgrade or replace our original 8TB EVO units, which are going on 3 to 4 years old now.
In December of '08 we installed a Maxx Digital Final Share SAN system consisting of a 16TB array shared to 6 workstations via high-speed Ethernet. You can read the full article I wrote on the installation at the Creative COW website.
The primary purpose of this installation was to allow a shared editing environment for three feature-length documentaries. We have in the neighborhood of 450 hours of footage (and growing) across all three, and our first doc, Foul Water, Fiery Serpent, is using around 100 or so hours. All footage is digitized to Apple ProRes 720p/60 via Apple's Final Cut Pro, so we're using a very low-bandwidth format for the edit.
We started really cutting on the project in March of '09, and as I have reported both on the Creative COW website and here on my blog, it has, for the most part, been a thing of beauty. I'm cutting on a Mac Pro / AJA Kona 3 workstation while my edit assistant cuts on a 21" iMac. We broke the doc into 9 segments to make project management easier and to allow each of us to work on different segments simultaneously.
To give you an idea of the size of this project, we have between 2,800 and 3,600 raw video clips, plus over 100 music cuts, animations, graphics, and voice tracks. So we're in the neighborhood of 4,000 to 4,500 media files for this project. That's the biggest project I've worked on yet. As I said, the SAN has worked great during the editing process.
However, in the past few months I believe we have found the limit of Ethernet-based SANs: playback of a large project timeline. In November we finally had a full 98-minute cut of the entire documentary, and I could not play the timeline without dropping frames. Not just once, either; it dropped frames multiple times, every time, during playback. There's plenty of speed on the RAID (about 600MB/s or more) and plenty of speed on the network (about 100MB/s), but for whatever reason I was dropping frames throughout the timeline.
Now, I don't believe the length of the timeline itself is the issue. In my testing I played a 30-minute episode of "Good Eats" in a continuous loop for 3 hours on multiple systems simultaneously, so the system can easily sustain a long playback cycle across multiple systems, let alone on a single Mac Pro workstation. No, there has to be something at work other than pure speed.
If you've read my blog entries, you also know that for months we've been dealing with an Ethernet port issue, introduced by Apple with the latest Mac Pros, that caused the network to disconnect from my Mac. So the dropped frames were thought to be part of the same problem. Over the Christmas holidays, with help from Small Tree Electronics, we finally dealt with that issue by moving the SAN to the Snow Leopard operating system, since Apple's latest updates include a fix for the disconnect.
But our dropped frames remain. The SAN is running as fast as ever, yet we're still dropping frames during playback of our 90-minute timelines. From what I can gather, we are the only facility running a project this large off this type of Ethernet SAN. All the other facilities are doing 30-minute or shorter programming and a lot of :30 to :60 spots. In our own shop, we have multiple workstations doing projects of 20 minutes or less with no problems.
How we're dealing with the dropped frames right now is to export a self-contained movie to a local 8TB array connected directly to my Mac Pro. That's how I screen the film for the client or for folks who come in for reviews. It's the only way I can play the film without it stopping.
So it appears the issue is the sheer size of the project and the number of files it has to access during playback of the timeline. It really shouldn't matter, but in our real-world application, the system simply does not support playback of an extended timeline from a project with this many files.
It's a real shame, because the system is performing incredibly well overall: we have two series being cut on it, and I've been able to work with a second editor simultaneously on the documentary. But if you can't play back your main timeline without dropping frames on a large project, then the system is not made for all editing applications, as I originally thought and was led to believe when I made the purchase.
So if you're working on shorter projects, such as episodic television and 1-hour shows, and you need to share media across multiple workstations, this is still a killer deal. The only alternative is Fibre Channel, and for the money nothing comes close to what this system does. We will continue to use it on most of our projects.
But for the long-form stuff like these documentaries, I'm going to invest in a few more local 8 and 16TB arrays. The primary workstation for each documentary will have its own dedicated local storage, and anything that needs to move to other workstations we'll push to the SAN. It'll make things a little less efficient for the really, REALLY big projects, but I'll have the best of both worlds: a low-cost SAN for 90% of our projects and high-speed local RAID for the documentaries.
So would I still recommend an Ethernet SAN? Absolutely, but go in understanding the limits and make sure it's right for your application before you buy. I won't say this system was a $20,000 mistake, but I would have spent my money a little differently 12 months ago had I known it would be limited on a project the size of the documentary. And we all know technology improves almost daily, so with any luck, future improvements will allow this type of SAN to support even the really, REALLY big projects.
After some suggestions from colleagues, I mixed down the audio tracks (we had a total of 24) in the timeline and attempted a full playback. The 1-hour-18-minute timeline dropped frames once, 38 minutes in, and then played clean to the end. So that's progress. It's not exactly efficient, since it took the computer a while to do the mix, but it's an improvement.
Today's just been one of those days where the Final Share SAN isn't working right, the AJA Kona 3 isn't working right, and FCP isn't working right either. And just as we're about to finish our first feature documentary.
It's one of those days where you wonder what the hell you invested all of this money for and if you should just switch over to something else....
To paraphrase a famous author, "I have seen the future of shared storage and his name is Ethernet."
Last month we invested in the new Final Share system from Maxx Digital, and after some tweaking we now have 16TB of shared storage supporting a high definition workflow with 6 workstations, all running Apple's ProRes HQ in both 720p and 1080i. Actually, "workstations" isn't quite the right word: since we're running Ethernet, we can connect any Mac computer to the array.
So in our case, we have three Final Cut Pro desktop workstations and three iMacs all connected. In today's testing we configured the three FCP workstations to each capture approximately 3 hours of 720p and 1080i ProRes HQ material. While that was happening, all three iMacs were playing back 20+ minute clips in QuickTime Player on a loop. After the capturing finished, we set all three FCP workstations playing a 90-minute timeline on a loop while the iMacs kept playing their clips. We left it all alone for several hours and everything was still playing. Six streams of high definition from one storage array, all over a simple Ethernet cable!
We plan to use the iMacs both to let producers review footage immediately upon capture and for assistant editors working on upcoming series. Once the footage is in the system, anybody can access it at any time, and since it's not Fibre Channel, I don't have to invest in top-of-the-line desktop editing systems for the assists.
Watch for a full article on this system coming up shortly, but wow, this thing really works and it's really affordable!
Ok, late news flash, as this information is actually from Macworld (what, two months ago now?!?), but the folks at ATTO were able to crank two Maxx Digital EVO HD SAS/SATA units up to 1200MB/s. 1200!!! Good golly Miss Molly, that's some serious horsepower.
Ok, ok, ok, how did they pull this off, since you can't connect two units to the same card? Easy: two ATTO R-380 SAS/SATA cards in a Mac Pro. Connect a Maxx Digital EVO HD unit to each card, stripe both EVO HD units as one large storage array, and presto: 1200MB/s.
Hmmmmm, how many streams of video would THAT be? You could do the Brady Bunch open times 5, at least I suppose. It would be fun to test that out with Uncompressed HD and see how many streams of HD we could run in real time with filters.
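Since I was curious, here's a rough stream-count estimate in Python. The per-stream rates are ballpark figures: about 166MB/s for uncompressed 10-bit 1080i, and about 27.5MB/s (Apple's quoted ~220Mbit/s) for ProRes 422 HQ at 1080i. Real systems need headroom, so treat these as paper-napkin upper bounds, not promises:

```python
# Rough stream-count ceilings for a 1200 MB/s striped pair. Per-stream
# rates are ballpark figures: ~166 MB/s for uncompressed 10-bit 1080i,
# ~27.5 MB/s (Apple's quoted ~220 Mbit/s) for ProRes 422 HQ 1080i.
# Real systems need headroom, so treat these as upper bounds.

BUDGET_MB_S = 1200

codec_rates_mb_s = {
    "Uncompressed 10-bit 1080i": 166.0,
    "ProRes 422 HQ 1080i": 27.5,
}

for name, rate in codec_rates_mb_s.items():
    streams = int(BUDGET_MB_S // rate)  # whole streams that fit the budget
    print(f"{name}: up to ~{streams} simultaneous streams")
```

A 3x3 Brady Bunch grid is 9 streams, so "times 5" (45 streams) is just past what the ProRes HQ math allows on paper. Close enough for a laugh.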
Here's a photo only a mama could love: a bunch of tech geeks standing next to the unit with the speed tests displayed on the screen.
Yet another reason I love these guys. Their products are really fast, they work, and you get this incredibly fine-looking support staff to help you out. Ok, maybe not fine-looking... sorta geeky-looking support staff... maybe in a creepy sort of way... but they know their stuff... so just pretend you don't know what they look like... maybe I should just shut up now.