That movie took an immense amount of computing horsepower, and it demonstrates, at least to me, the extent to which network workloads are beginning to split in two.
When it’s put onto DVD, the movie will weigh in at something like 5 gigabytes. Requested by 250,000 technology executives (or more likely, their kids) on a quiet evening – for display on their new IP TVs or mobile handsets, or through their set-top boxes – that’s going to put some burden on the network. Big data (5 GB), relatively small compute (decode, check for authorization, etc.). Caches will help, but it’s a classic parallel throughput problem.
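To make the scale of that throughput problem concrete, here’s a rough back-of-envelope sketch. The 5 GB movie size and 250,000 viewers come from the scenario above; the two-hour viewing window is my own assumption for illustration.

```python
# Back-of-envelope estimate of the streaming workload described above.
# Movie size (5 GB) and viewer count (250,000) are from the post;
# the 2-hour viewing window is an assumed figure for illustration.

GB = 10**9  # bytes

movie_size_bytes = 5 * GB
viewers = 250_000
window_seconds = 2 * 60 * 60  # assume everyone streams over a 2-hour evening

total_bytes = movie_size_bytes * viewers           # total data delivered
aggregate_bps = total_bytes * 8 / window_seconds   # sustained bits per second

print(f"total delivered: {total_bytes / 10**15:.2f} PB")
print(f"aggregate bandwidth: {aggregate_bps / 10**12:.2f} Tbit/s")
```

That’s on the order of a petabyte delivered in a single evening, at terabits per second of sustained aggregate bandwidth – which is exactly why caches and edge replication matter here, and why the compute per request stays trivial by comparison.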
Which is an entirely different workload from the one required by the banking customer I met with last week – who was worrying about Sarbanes-Oxley compliance, and the introduction of provisioning technology to manage and audit risk. Running risk analytics over large data warehouses – big compute, small data (nowhere near 5 GB).
The industry keeps trying to solve both problems with the same systems. I’m not sure that’s a useful pursuit – it seems truer every day: there is no one hammer for all nails.