Five years ago Creative Liquid moved to a fully networked post-production environment. In the years since our initial implementation, technologies have changed, software has been updated, and we’ve moved offices! We figured it was a good time to revisit this topic and share how we’ve taken the original solution outlined in our Network Rebuild: EVO Review article and updated it for 2021.
What’s the same
- We still use a 10 gigabit high-speed copper network for connections between edit suites, workstations and our networked storage devices.
- We also still have our original EVO, which even after 5 years is still going strong.
- We still primarily edit in Apple’s Final Cut Pro X (FCPX), with DaVinci Resolve for color grading
What’s changed
- We’ve moved to a bigger office!
- We have moved to a full 10 gigabit network vs the original direct connection approach
- We have added more storage and introduced data redundancy across our project lifecycle
- We have updated our workstations
In September 2018 Creative Liquid moved to a larger office in downtown Alexandria, VA. The move allowed us to re-imagine our network and IT infrastructure entirely. Our new office had originally been split into two separate workspaces, one per floor, so our first task was to plan out our use of the space and determine where we wanted to run, re-run, or reuse networking cables.
Our new office has five offices, a conference room, a studio, an equipment room, a server room, and an open workspace area with four workstations. We planned for each ‘person’ to have one 10GbE and two 1GbE ports at a minimum, and we ran some extra lines in places for future flexibility; our Edit 1 suite, for example, has fully redundant runs (just in case!).
Since Cat6a cabling, which we use for our 10GbE connections, is not commonly installed due to cost, we had to run all of our Cat6a lines ourselves. We also ended up re-running Cat5e to the first floor, since it had been severed in the ceiling when the office was split. In the process of tracking down all the network trunks, we removed 400 lbs of old cabling from the ceiling and recycled it. (Note: if you find yourself in possession of large piles of copper cable, check your local recycling options; we got a few hundred dollars back!)
We ended up with 16 10GbE (Cat6a) runs and 78 1GbE (Cat5e) runs, and I can now patch a panel or a wall port in my sleep. For more information on cabling types and their capabilities, check out our original article, Network Rebuild: EVO Review.
Since we were adding potential editor and user seats, all needing access to the EVO Network Attached Storage (NAS) device over the 10GbE network, we determined we needed to add a 10GbE switch. It also helped that the price of 10 gigabit switches has come down substantially since our initial implementation, when we originally determined a switch was not worth the expense in our smaller office.
We opted for a Netgear XS712Tv2 with 12 ports. The XS712T is a ‘smart managed switch,’ so it has the ability to manage some network settings and ports. Initially we continued to use direct connections to EVO for key workstations (Edit Suite 1 and Edit Suite 3), and used two of EVO’s 10GbE ports for the switch in a Link Aggregation (LAG) interface shared between lower-priority systems (logging system, Edit Suite 2, offices, etc.). After a bit of testing and verification that data transfer speeds were not negatively affected, we transitioned all four 10GbE ports on EVO into a single LAG port on the switch. For the past year or so, all users on the 10GbE network have shared the same ‘pipe’ to work on EVO.
I joined Creative Liquid in March of 2016 as the CTO. Coming from an IT-heavy background at a federal agency, my work experience has been structured around procedures, policies, and planning. After we settled into our office, the CEO, the Post Production Manager, and I started meeting to discuss our data handling, data lifecycle, and archival procedures. We formalized existing processes and created a standard operating procedure for data, documenting how media from the field is handled when it comes in, where it goes, how it’s stored, how long it’s stored, and how it is eventually archived. During this process we identified key areas where we could improve our handling of data and reduce risks to the company in case of system failure or disaster.
The first issue we wanted to address was how we were utilizing EVO. We have 64TB of storage, which was plenty when we were working with HD video, but over the past few years we have moved almost entirely to 4K. This jump means a project that used to be a few hundred gigabytes is now several terabytes. Since EVO was a sizable investment, we wanted to ensure we had room for all our active work. This led us to add a new NAS to the network, often referred to as our ‘Nearline’ storage. Nearline is used for near-term archival of projects and as a redundant storage location for all media cards.
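The jump from “a few hundred gigabytes” to “several terabytes” tracks with the codec math. Here’s a rough sketch; the data rates are approximate ProRes 422 targets and are our assumption for illustration, since actual codecs and frame rates vary from project to project:

```python
# Back-of-the-envelope: footage size per hour at HD vs UHD (4K).
# Data rates below are approximate ProRes 422 targets in Mbit/s
# (an assumption for illustration; real projects vary).
RATES_MBPS = {"1080p (HD)": 147, "UHD (4K)": 589}

def gb_per_hour(mbps: float) -> float:
    """Convert a video data rate in Mbit/s to GB per hour of footage."""
    return mbps * 3600 / 8 / 1000  # Mbit/s -> MB/s -> GB/hour

for label, rate in RATES_MBPS.items():
    print(f"{label}: ~{gb_per_hour(rate):.0f} GB per hour of footage")
```

At these rates an hour of UHD footage is roughly four times the size of an hour of HD, before you even count multicam angles or camera originals.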
The goal was to provide a place for inactive or completed projects to live that was accessible over the 10GbE network and less expensive per gigabyte, even if not powerful enough to support editing. We selected a Synology RackStation (RS2418+) with 12 drive bays. By initially fitting it with six 12TB drives, we were able to add 42TB of usable storage over the 10GbE network for about $2,700.
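The 42TB usable figure follows from the redundancy layout. Assuming a two-drive-redundancy volume (e.g. Synology SHR-2 or RAID 6; the exact layout is our assumption here), the math sketches out like this:

```python
# Rough check of usable capacity for six 12 TB drives with two-drive
# redundancy (assumed SHR-2 / RAID 6; the exact volume layout may differ).
DRIVES, SIZE_TB, PARITY = 6, 12, 2

raw_tb = (DRIVES - PARITY) * SIZE_TB   # data capacity in decimal TB
raw_tib = raw_tb * 1e12 / 2**40        # same capacity as the OS reports it (TiB)
print(f"{raw_tb} TB raw data capacity ~= {raw_tib:.1f} TiB before filesystem overhead")
```

Two drives’ worth of parity leaves 48TB of raw data capacity, which the operating system reports as roughly 43.7 TiB; filesystem overhead brings that down to about the 42TB we see in practice.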
Note: We have tested editing projects from Nearline, and despite the lower-end hardware, it performed remarkably well in HD editing tests. The performance and features of the Synology fall short of EVO’s, but at about 1/10th of the cost it performed far better than expected.
Creative Liquid has been using LTO tapes to archive projects since 2012. Two copies of each tape are created: one kept in the server room, and one kept off site. This covers completed projects, and Nearline keeps a separate copy of our media cards, but Nearline sits directly below EVO in the server room. While this provides redundancy in case of hardware failure, we determined that it left us open to a host of other possible scenarios that could lead to data loss.
We researched a few options for off-site backups, including cloud storage, AWS Snowball, and LTO tapes. While cloud computing has taken over elsewhere, we found it unsustainable for video production for a few reasons:
- Internet speeds: At the time we first reviewed this option, we were operating on a business cable internet connection. The city of Alexandria does not have commercial fiber optic internet, and the jump to a dedicated fiber connection was cost prohibitive for a small business. Our original speeds maxed out at about 10Mbps upload. We performed some scheduled tests and determined that it would take about 30 hours to upload 100GB. In 2017 we processed about 30TB, which at that rate would take 382 days of uninterrupted uploading.
- Cost: Cloud storage and computing is a fantastic shift in the industry. However, with the pay-as-you-go model, you can quickly run up the bill! For AWS S3 Glacier storage, at $0.004/GB per month, we were looking at an annual cost of about $1,500 for that 30TB of data, and since we keep our archives for several years, the monthly bill grows as the archive does.
We determined that our archival needs would cost us about $19,000 over five years using AWS Glacier ($107,000 in standard S3!). The cost of LTO tapes over the same period was calculated to be $2,550 (originally based on $85 per LTO-7 tape; they had dropped to $53.95 per tape at last purchase). We settled on LTO tapes and upgraded to a 1Beyond ThunderTape deck, using YoYotta LTFS software.
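The figures above can be sanity-checked with a few lines of arithmetic. This sketch uses the numbers from this article (our measured 30 hours per 100GB upload rate and the then-current Glacier price); actual AWS pricing and our uplink speed have both changed over time:

```python
# Sanity-check the cloud archival math using this article's figures.
GLACIER_PER_GB_MONTH = 0.004   # USD per GB-month, S3 Glacier at the time
ARCHIVE_TB = 30                # roughly one year of footage (2017)
UPLOAD_HOURS_PER_100GB = 30    # measured on our ~10 Mbps cable uplink

archive_gb = ARCHIVE_TB * 1000
upload_days = archive_gb / 100 * UPLOAD_HOURS_PER_100GB / 24
annual_glacier = archive_gb * GLACIER_PER_GB_MONTH * 12

print(f"Upload time for {ARCHIVE_TB} TB: ~{upload_days:.0f} days")
print(f"Glacier cost for {ARCHIVE_TB} TB: ~${annual_glacier:,.0f}/year")
```

That works out to roughly a year of continuous uploading and about $1,400–1,500 per year of storage for a single year’s footage alone, which is how the five-year total climbs toward $19,000 as archives accumulate.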
For our active projects, we took some inspiration from AWS and built our own ‘Snowball’ using a G-SPEED Shuttle device, which we named Sluice in keeping with our liquid-based naming convention. Sluice comes into the office every other week for a full clone of EVO, and then returns to its off-site storage location. Sluice uses some custom scripting to sync changes from EVO’s volumes, so each bi-weekly backup copies only changed files and leaves unchanged material alone. The total process generally takes about 2-3 hours each time and can run in the background over the 10GbE network.
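Our actual Sluice scripts are specific to our volumes, but the changed-files-only approach can be sketched in a few lines of Python. This is a hypothetical minimal version, not our production tooling; real sync tools such as rsync also handle deletions, permissions, and partial transfers:

```python
import os
import shutil

def sync_changed(src: str, dst: str) -> int:
    """Copy files from src to dst only when they are missing or modified
    (compared by size and modification time), leaving unchanged files
    alone. Returns the number of files copied."""
    copied = 0
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            st = os.stat(s)
            if (not os.path.exists(d)
                    or os.path.getsize(d) != st.st_size
                    or os.path.getmtime(d) < st.st_mtime):
                shutil.copy2(s, d)  # copy2 preserves timestamps
                copied += 1
    return copied
```

The sketch shows why the bi-weekly pass is fast: after the first full clone, unchanged files are skipped entirely, so each run only moves the delta since the last visit.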
As a production facility using FCPX, we are almost exclusively using Apple workstations. In the last few years we’ve seen some exciting, and overdue, updates to editing workstations. Our 2013 Mac Pro (the ‘trashcan’) served us well, but was beginning to show its age when working with large multi-stream 4K projects.
We’ve since updated our two primary edit suites to iMac Pros, which, along with beefy hardware and gorgeous displays, come with integrated 10GbE ports. Our Mac Pro is still useful, serving as a logging and archival workstation and running virtual events from our HQ during the 2020 pandemic.
With 10GbE adapters now integrated into Apple's new Pro machines, we’ve been able to move away from our ATTO devices. The ATTO ThunderLink devices run about $1,000 apiece and require special drivers, so moving to native network interfaces was a blessing from the IT management perspective. Our older Thunderbolt 2 Macs still use the ATTO devices, but our new edit suites and Thunderbolt 3 laptops use significantly cheaper and easier-to-manage adapters; OWC Thunderbolt 3 Pro docks (MSRP $299) and OWC Thunderbolt 3 10G Ethernet Adapters (MSRP $199) have helped make our 10GbE network accessible to more users.
Creative Liquid has grown a lot in the five years since we last shared our environment. We continue adding new capabilities, increasing performance, and solidifying our processes so our team can focus on what they do best, telling your story, and spend less time on the day-to-day operations of the underlying IT systems that keep everything reliable and smooth.