Introducing Project Blackbox


As I’ve been saying for a while, our customers – more specifically, a segment* of our customers – face a diversity of tough challenges. What does the CIO in midtown Manhattan do when she runs out of roof space or power? How does an aid agency deliver basic connectivity to 5,000 relief workers in a tsunami-stricken metropolis? What does an oil company do when it wants to move high performance analytics onto an offshore platform or supertanker? Or a large web services company, when it wants to cookie-cutter its infrastructure next to a hydroelectric plant for cheap power – within weeks, not years?


None of these are easy problems to solve – especially one computer at a time. And they’re more commonplace around the globe than you’d think. So now you know the motivation behind our asking a simple question: “what would the perfect datacenter look like?”


To improve upon its predecessor, the traditional datacenter, it’d have to be more space and power efficient. It’d have to deliver very high performance, and be designed for machines, not for people with plush offices. It’d have to be available within weeks, not years. And it’d have to be portable, so customers could deploy it anywhere – in a disaster area, or next to a hydro generator.


But let’s start with the most basic question. How big would it be?


In the world of vertically scaled, or symmetric multi-processing, systems, pools of CPUs share access to a common set of memory. But the size of a given system has a physical and logical limit: it can be no bigger than the private network used to connect all the disparate internal elements.


But the future of the web is clearly moving toward horizontal, or grid, computing. In a grid, a conventional network is used to connect collections of smaller*, general purpose elements (like Sun’s Niagara or Galaxy systems). The question “what’s the biggest grid?” has no obvious answer – a grid can be as big as you want. Witness TACC, where they’re building the largest supercomputer on the planet out of general purpose elements.
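For the programmers in the audience, here’s a toy sketch of the contrast in Python (purely illustrative, not Sun code): the “SMP” version has threads sharing one address space, while the “grid” version hands each worker its own slice of the data and passes only messages – which is exactly the property that lets a grid sprawl across as many small boxes as you like.

```python
# Illustrative contrast between vertical (shared-memory) and
# horizontal (message-passing) scaling. A toy sketch, not Sun code.
from multiprocessing import Pool
from threading import Thread

DATA = list(range(100_000))

def smp_sum(n_threads=4):
    # Vertical/SMP style: threads share one address space, so every
    # worker reads DATA directly. The ceiling is the single machine.
    results = [0] * n_threads
    def worker(i):
        results[i] = sum(DATA[i::n_threads])
    threads = [Thread(target=worker, args=(i,)) for i in range(n_threads)]
    for t in threads: t.start()
    for t in threads: t.join()
    return sum(results)

def grid_sum(n_workers=4):
    # Horizontal/grid style: each worker gets only its own slice and
    # sends back a message. Nothing is shared, so the pool could just
    # as well be a rack of small general purpose boxes on a network.
    chunks = [DATA[i::n_workers] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        return sum(pool.map(sum, chunks))

if __name__ == "__main__":
    assert smp_sum() == grid_sum() == sum(DATA)
```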


So a while back, we asked a few talented systems engineers a simple question: is there an optimum size for a horizontally scaled system? Interestingly enough, the answer wasn’t rooted in the Solaris scheduler or a PhD thesis. It was rooted in the environmental realities faced by the customers I cite in the second paragraph. And perhaps more interestingly, in your local shipyard.


Shipyard?




The biggest thing we could build would ultimately be the biggest thing we could transport around the world – which turned out to be a standardized shipping container. Why? Because the world’s transportation infrastructure has been optimized for doing exactly this – moving packets, in the form of containers, on rails, roads and at sea. Sure, we could move things that were bigger (see image), but that wouldn’t exactly be a general purpose system.


So the question at hand became, “how big a computer can you build inside a shipping container?” And that’s where the systems engineering started.
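To make the question concrete, here’s the back-of-envelope arithmetic. Every figure below is a rough assumption for illustration – not a spec sheet:

```python
# Rough sizing: how much compute fits in a standard 20-foot
# container? All figures are illustrative assumptions.
CONTAINER_LEN_M  = 6.1   # interior length of a 20-ft ISO box, approx.
AISLE_AND_GEAR_M = 1.6   # assumed loss to door, plenum and cooling gear
RACK_DEPTH_M     = 0.6   # racks turned sideways, per the text below
U_PER_RACK       = 40    # usable rack units per rack, assumed
SERVERS_PER_U    = 1     # 1U general purpose boxes, assumed

racks   = int((CONTAINER_LEN_M - AISLE_AND_GEAR_M) / RACK_DEPTH_M)
servers = racks * U_PER_RACK * SERVERS_PER_U

print(f"~{racks} racks, ~{servers} 1U servers per container")
# With these assumptions: ~7 racks, ~280 servers in one box.
```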


First, why are servers oriented in racks and cooled by fans front to back? To maximize convenience for humans needing to interact with them. But if you want to run a “fail in place” datacenter, human interaction is the last thing you want. So we turned the racks 90 degrees, and created a vastly more efficient airflow across multiple racks. And why not partially cool with water in addition to air – if you burn your hand, do you wave it in the air, or dunk it in a bowl of ice water? The latter: water is a vastly more efficient coolant.
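The hand-in-ice-water intuition is easy to check with standard textbook constants (well-known physics, not Sun measurements): per unit volume and per degree, water soaks up roughly 3,500 times more heat than air.

```python
# Why water beats air as a coolant: volumetric heat capacity.
# Standard constants at roughly room temperature.
AIR_CP,   AIR_RHO   = 1005.0, 1.2     # J/(kg*K), kg/m^3
WATER_CP, WATER_RHO = 4186.0, 1000.0  # J/(kg*K), kg/m^3

air_per_m3   = AIR_CP * AIR_RHO      # J per m^3 per degree K
water_per_m3 = WATER_CP * WATER_RHO

print(f"water carries ~{water_per_m3 / air_per_m3:,.0f}x more heat "
      "than air, per unit volume per degree")
# -> roughly 3,500x, which is why dunking the hand wins.
```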


A non-trivial portion of an average datacenter’s operating expense is the power required to chill arbitrarily spaced, very hot computing platforms – vector the airflow, augment it with a water chiller, and the cooling expense plummets. As does your impact on the environment. Did I mention the “eco” in eco-responsible also stands for economics? For many companies, power is second only to payroll among datacenter expenses. (Yes, the power bill is that big.)
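A quick worked example shows how fast that overhead compounds. The loads, rates and overhead ratios below are invented for illustration; your datacenter will differ:

```python
# Hypothetical annual power bill. PUE = total facility power
# divided by IT power; all numbers below are assumptions.
IT_LOAD_KW   = 200.0     # assumed IT load
RATE_PER_KWH = 0.10      # assumed utility rate, USD
HOURS_PER_YR = 8760

def annual_cost(pue):
    return IT_LOAD_KW * pue * HOURS_PER_YR * RATE_PER_KWH

conventional = annual_cost(2.0)  # assumed loosely packed, air-chilled room
contained    = annual_cost(1.3)  # assumed tightly coupled air+water loop

print(f"conventional: ${conventional:,.0f}/yr")
print(f"contained:    ${contained:,.0f}/yr")
print(f"savings:      ${conventional - contained:,.0f}/yr")
```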


And that’s how we started to go after power efficiency.


Second, if you can generate power for less than the power company charges you, why not do so? Put a generator next to the chiller in a sister container, and you’ve got access to nearly limitless cheap power. (Heck, you could run it on bio-diesel.)
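Whether self-generation wins is pure arithmetic on local prices. A hedged sketch – every input below is an assumption, and bio-diesel would change the numbers:

```python
# Breakeven check for on-site generation, with assumed inputs.
DIESEL_PRICE_PER_L = 0.70   # USD per liter, assumed
FUEL_ENERGY_KWH_L  = 10.7   # thermal energy in a liter of diesel
GENSET_EFFICIENCY  = 0.38   # assumed electrical efficiency

cost_per_kwh = DIESEL_PRICE_PER_L / (FUEL_ENERGY_KWH_L * GENSET_EFFICIENCY)
print(f"generated power: ~${cost_per_kwh:.3f}/kWh")
# ~$0.17/kWh with these inputs - cheaper than some utility tariffs,
# pricier than others; the generator wins only where rates are high.
```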


And if power rates or workload requirements change and you want to relocate your container – good news, the world’s transportation infrastructure is at your disposal. Trains, trucks, ships, even heavy lift helicopters. You can place them on offshore oil rigs. In disaster areas. In remote locations without infrastructure. To wherever they’re most needed.


Finally, in most datacenters I visit, I see more floor tiles than computers. Why? Because operators run out of power capacity long before they fill up their datacenters – leading them to waste a tremendous amount of very expensive real estate with racks spaced far apart. In a container, we go in the opposite direction – with plenty of power and chilling, we pack systems at a multiple of conventional density and really scrimp on space. And a container can run anywhere – in the basement, the parking garage, or on a rooftop, where utilities, not people, belong.
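A toy calculation makes the point – all numbers hypothetical:

```python
# Why floors empty out before they fill up: the building's power
# feed, not its square footage, caps the rack count.
BUILDING_POWER_KW = 500.0     # assumed facility power budget
KW_PER_RACK       = 12.0      # assumed draw of a densely packed rack
SQFT_PER_RACK     = 30.0      # assumed rack + clearance footprint
FLOOR_SQFT        = 10_000.0  # assumed raised-floor area

racks_power_allows = int(BUILDING_POWER_KW / KW_PER_RACK)  # 41
racks_floor_holds  = int(FLOOR_SQFT / SQFT_PER_RACK)       # 333

print(f"power supports {racks_power_allows} racks; "
      f"the floor could hold {racks_floor_holds}")
print(f"-> {100 * (1 - racks_power_allows / racks_floor_holds):.0f}% "
      "of the (expensive) floor sits empty")
```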


With a ton of progress behind us, and enough customer interaction to know we’re on to something, we’ve unveiled our alpha unit and gone public with the direction. We’ve done a lot of detail work as well, integrating the container’s security systems into enterprise security systems. It knows where it is via GPS (you can locate them via Google Maps, if that’s your bent). Sensors know if the container’s been opened or moved. We’ve even done basic drop tests (one, accidentally) to deal with transportation hazards (the racks inside can handle an 8g impact!). And we’ve explored camouflage options, too (you really don’t want a big Sun logo screaming “steal me, I’m full of RAM!” on customer units).
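For flavor, here’s a hypothetical sketch of what that kind of telemetry loop might look like. The sensor-reading functions are placeholders standing in for real hardware queries – this is not a real Sun API:

```python
# Hypothetical container telemetry: GPS position, tamper state,
# and a shock threshold. All sensor functions are placeholders.
SHOCK_LIMIT_G = 8.0            # racks rated to 8g, per the post

def read_gps():                # placeholder: would query a GPS module
    return (37.3861, -122.0839)

def door_opened():             # placeholder: would query a door switch
    return False

def peak_shock_g():            # placeholder: would query an accelerometer
    return 0.4

def check_once():
    alerts = []
    if door_opened():
        alerts.append("container opened")
    if peak_shock_g() > SHOCK_LIMIT_G:
        alerts.append("shock above rated 8g")
    return {"position": read_gps(), "alerts": alerts}

if __name__ == "__main__":
    # In practice this would run on a schedule and push results
    # into the enterprise security console.
    print(check_once())
```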




Every customer we’ve disclosed this to has had a different set of concerns or challenges. None, in my mind, are insurmountable. But we don’t have all the answers, of course – that’s why we’ll be working with key partners and integrators (one customer wanted the container to detonate if it was breached – er… perfectly doable, just not something Sun would do).


At a top level, we know there is no one hammer for all nails.


But in this instance, there might be one blackbox for all of network computing.


Specs and details to come – and in the interim, here are some great photos and usage scenarios (I especially like the Mars Rover companion – that was Greg’s idea).


____________________________________


* more on this later.


Filed under General

52 responses to “Introducing Project Blackbox”

  1. Larry

    Detonate? Gives new meaning to ‘lights out’ management. 🙂

  2. Doing this – “one customer wanted the container to detonate if it was breached – er… perfectly doable, just not something Sun would do” – would probably be a nice thing to do. Any video demo? 🙂

  3. Bob

    Will this be available for Try and Buy? 🙂

  4. My bet: you started Project Blackbox right after reading Cringely – right? I like it much more than stealth projects in underground parking garages in Mountain View, though 😉

  5. Mike

    I wonder if there will be a separate version with heating in addition to cooling.
    Otherwise, in our climate, it won’t be possible to start that thing until spring comes 🙂

  6. Great concept. I can think of all kinds of great uses for Blackbox — not to mention a great advertising idea I’ve been kicking around for nearly ten years. I just blogged about it — and am offering the idea to Sun, free of charge.

  7. heyrobertdavis,
    We seem to think along similar lines, except I thought taking the blackbox with me would demonstrate its powers a bit better: Marketing Sun’s Project Blackbox. I particularly like the cruise ship part of the idea…

  8. Hi Jonathan,
    I am a big fan of your site and your openness. But I have to say I had the same thought as Rainer posted. I also read the same Cringely article, “The Google Box,” in Nov 2005, almost one year ago. Can you or your engineers tell us how much better your Container Box is than the *current* Google Box, please?
    Cheers,
    Kempton

  9. Sun never stops amazing me with their ability to “think outside of the box”! Like ZFS and UltraSPARC T1, you always manage to re-define things, you always seem to not restrict your innovations by conventional thinking. Well done.
    Jonathan, your blog is almost as addictive as someone else’s coffee.

  10. I can’t resist; here’s my idea for a slogan for the Blackbox:

    “The Sun Blackbox, a datacenter in a box: Just Add Water*”

    *…and power, and Internet connection.”

  11. powlow

    Really amazing looking and revolutionary. Can you bring one to Barcelona? 🙂

  12. Sanjay Balram

    Jonathan, forget the naysayers. I agree that Sun is on to something big. The simplicity and scalability of this idea is simply awesome. I live in India. As India transitions to a developed country and grapples with infrastructure problems (which will naturally take time to solve), this can help us surmount many of them. My mind swims with the myriad possibilities for this idea. Here’s wishing you and your team great success. Hope Sun’s Blackbox becomes the iPod of the IT world – ubiquitous and a revolution!

  13. AWESOME! It is products like this that make me proud to work for a company that continually thinks outside the box! Add solar photovoltaic panels to the roof and a wind turbine (for units located outside…) and you have an even greener unit!

  14. Ahsan

    It will be interesting to witness on-site repairs to the “Perfect DataCenter”, especially in the rooftop scenario with engineers dangling from a chopper. I am sure it will be covered live by most news networks the first time it’s done 😉

  15. Next to the Blackbox I have another idea for a product Sun could offer.
    See The Schwartzbox

    Bye,
    Oliver

  16. Gary A. Ross

    Great idea! It really drives home the point of the datacenter being much too large. But it also forces a hard look at scalability. You state that the web is clearly moving towards horizontal scalability. I agree, to a point. There are now, and will continue to be, applications that just run best on a big honkin’ box.
    The issue of horizontal or vertical scalability is still largely a matter of the application mix, not necessarily hardware. I hope Sun is working hard with ISVs to either port or move applications to Solaris. I also hope you have designed this so that it can fit the odd E25K or so in the mix if needed.

  17. Queen’s ‘I Want to Break Free’ hits the sweet spot with Project Blackbox.
    Now, what I am going to say may be wacky.
    Idea #1
    ——-
    Seriously, cooling can be improved by another few percent (at the very least) with this particular design. Just lay all this inside another container, with the space between the two filled with sand or something similar that absorbs heat when wet. The outer container, instead of just doing shock absorption, also serves this dual purpose. A group of sensors would monitor when the sand dried off and pump in some water again.
    There is a project pioneered by an African inventor who sells a cheap ‘refrigerator’ replacement: food is packed inside one of two earthen containers, the space between the two is packed with mud, and the water between them keeps food fresh at least three times longer.
    http://www.treehugger.com/files/2006/08/mohammed_bah_ab.php
    Idea #2
    ——-
    If Blackbox is integrated with a geothermal heat exchanger (google for ‘earth heat exchange’), the cooling can be improved today. This is not rocket science, people! It would need the container to be located on the ground.

  18. Larry Hunter

    Remember when computers built with discrete component transistors gave way to large scale integration? Deja vu all over again, in a larger large scale.

  19. Am I the only one, or do they look like giant Legos(TM)? Actually, why not make them like giant Legos? If you want to add capacity, snap a new data center block on to the existing data center. You could go vertical or horizontal. The connectors would contain all the needed power, cables, cooling, etc. Literally plug and play.

    Of course you could then make various standard sized data centers. There could be half length data centers and 1/3 height data centers. This would allow people to create data center artwork, or just to grow a data center in a non-traditional way. A growing data center could adapt to its physical environment or even to a changing environment. It would be easy to reshape the data center to meet other space needs.

    Sorry for the attempt at humor, but your explodable data center comment got me thinking about data centers in a whole new way. But isn’t that the whole point?

  20. Austin

    Looks good, looks practical, looks rugged-ish, looks cost effective, looks eco-responsible.
    But I am unconvinced about the cooling required for it. So are you telling us: take this container anywhere on earth, in any climate, and just the cooling provided by air and water will suffice?

  21. Stupid Idea

    First the Grid, now Blackbox. It just shows the poor judgment of the management at Sun.
    Stop throwing darts to see which one sticks. Concentrate on operations.
    If there’s one thing you should learn from Mark Hurd of HP, it is OPERATIONS. No fancy new ideas that waste financial and human resources. Just do the simple things right. That is the simple formula for success.

  22. JustAnotherJerk

    The photos around this rollout seem to have been heavily photoshopped – i.e. the multiple containers in the warehouse, the container going into some relief situation, etc.
    It seems like you’re trying to show “possibilities”, not reality. But it creates the illusion that the system has been tested/deployed more than it has, which I suspect is the desired illusion.
    It would be nice if photoshopped pics were labelled as such. Then you can do what you want, and the reader/consumer gets full disclosure. Win/win.
    Also: this blackbox stuff reminds me of things the military has done. I thought they were drop-shipping data-center-in-a-box things back in the punch card days (with push-out sides even, like big RVs nowadays).
    I have my doubts on whether you’ve shake-and-bake tested a loaded box enduring the transportation scenarios in your pics (off-road etc.) – i.e. that it just works when it arrives at its destination.
    JAJ

  23. stmok

    Very interesting idea… Wouldn’t the military be interested in something like this? (We hear of all this “digital battlefield” stuff. I suppose it’s very convenient to carry a few of these containers on C-5s or C-17s on deployment.)

  24. Mark McKenzie

    It’s really really REALLY nice to see SUN go in the right direction with this product, it makes me feel proud to have worked for SUN.

  25. Pablo Ruiz

    With Sun Blackbox, forget about “rack-ready” devices. There is going to be a paradigm shift; we need to start thinking in terms of servers/storage/UPS/network gear that is “container-ready” or maybe even “blackbox-ready”. Way to go, Sun. Keep the innovation coming…

  26. John H Silver

    Interesting niche product – but then the real payback is selling the contents, not the box! But I couldn’t help smiling about the water cooling – takes me back to the old mainframe days! 🙂

  27. Dmitry Khrustalev

    Well, my understanding is it’s water cooled (needs chilled water)? Does it need UPS/conditioned power? Can you connect me to someone on the project team? We are trying to solve similar server farm scaling issues; it could be interesting.

  28. This is a fantastic idea! Does it have a UPS built in?
    If it had enough of a UPS built in to run entirely off solar power (not as inconceivable as one would think), this could be a great step in getting the internet to places it’s never been before.

  29. JEnger

    What’s to stop me from bringing the truck ’round one night and nicking your datacenter?
    Which is a serious possibility, because I want one of these beauties! ;o)

  30. Prince

    Jonathan, you are doing a very good job of advancing Sun’s good work through the blog. As a shareholder I appreciate that.

  31. Jason Fordham

    The blackbox is an ideal platform for bringing together communities which already accrete around software: there’s a business to be made filling stadiums with MMORPG players, where a blackbox on tour hosts the environment. The big screens can show realtime rendering of scenes from the game.

  32. Jak

    So what are your plans for all that energy being pumped out of this box?
    You ought to at least be able to heat your pool – or better yet turn it back into electricity and re-charge your Tesla.

  33. Hon Hwang

    Yep, I can see how the military might be interested in this kind of thing. I saw a documentary made by PBS a while ago about the U.S. Army’s Combat Support Hospital. One segment shows how the surgical theatres are shipped in shipping-container-like boxes that fold out. Maybe these Blackboxes can be used the same way (for computing needs in the field).
    Another use I can see is in entertainment. These Blackboxes go with a band on concerts and tours, and can be used as massive storage to digitise the shows. Another idea is a portable outdoor cinema – HD quality movies, projectors, speakers and everything in boxes.
    Heck, send them to the moon and we can truly have “off-site” data storage – or a new module for the space station.

  34. I think Halliburton would be interested in this for sure. Here in the Southwest, space limitation is not yet a real problem, but the idea of portability is fantastic, especially if you are operating out of a leased structure!


  36. TranceMist

    I think you should offer a companion product: The Bingebox – a shipping container full of vodka.

  37. Anonymous

    In the ’80s I was assigned as a SIDPERS chief with the 101st Airborne. We had a system mounted in a semi-trailer with a connection to the building we used. In the event of a deployment, the plan was to drop the cables from the building, hook up a tractor, and away we would go with all our data on board. The blackbox is not a new idea, but still a good one.

  38. Tom Rogowski

    Jonathan, I’m a former Sun datacenter employee. Before my life at Sun I was a U.S. West Coast fleet manager for a container leasing company and one of the first internationally certified container inspectors. I like the idea of Project Blackbox but wonder how much thought has gone into the environmental problems a container encounters. A normal steel container will get to well over 120 degrees inside in the summer; I’d hate to see how hot a black one will get.

    I got a laugh out of the picture of a blackbox in the parking garage. You can get one there with some work, but very few garages have clearance for a truck – the one in the picture certainly doesn’t. Containers are moved primarily on ships, and handling damage is common. What are you going to do when a container receives a good dent, or worse, a cut? Repairs are going to necessitate removing all the equipment; computers don’t do well around welding torches. And even the best-locked container can be broken into – all you need is a cutting torch.

    A container in tropical areas is highly susceptible to insects. You wouldn’t believe the ones I’ve seen and had to fumigate. You’d get a first-hand look at Grace Hopper’s original definition of “bug.” Like I said, I think it’s a good idea; I just hope the team is seriously looking at where these boxes may go and how they’re getting there.

  39. Anonymous

    Try getting that through customs. By the way, advanced encryption tech is export restricted.

  40. ir123sh

    Many of these blog posts seem just a few months or years behind. The military has had this for at least 15 years. Not just a data center but operational capabilities as well, all in tractor trailers (they’re mobile on their own!): one trailer for the data center, one trailer for cooling, one trailer for operators with consoles & printers, and one trailer for documentation (yes, the idea is that old). All hooked together in minutes when they park.
    I think they stopped using it when satellite/wireless became an option! These things are the same as everyone having their own datacenter at the department level. Why would you ever do that when you have broadband capacity of amazing speeds today, and the ability to buy storage/processing as a service from someone who has professionally hardened, secured and backed up the facility?
    What happened to “the network is the computer”?
    I guess we’ll all watch to see what happens. I’m sure the Sun team has done the market research to know what the clients want, and they want gaggles of these babies.
    Love the idea of the data center container in the disaster zone. Plugged into what? Cooled by what?
    I apologize for the poor spacing and readability of my comment. It seems this blog does not allow WYSIWYG; you must know HTML syntax.

  41. responder

    I find all the comments about heat somewhat funny. How do you think food gets moved around the US? Never seen a refrigeration unit like a Thermo King on a trailer before, or just never bothered to pay enough attention?

  42. Hi Jonathan! Long time since the BEHEMOTH days… cool to see what you’re up to. I’m in the process of compressing the Microship lab into a 40-foot insulated container so it can be redeployed essentially anywhere (I was getting very sloppy in 3000 square feet of funky pole building anyway <grin>). Containers are great, even beyond their obvious modularity… built like tanks, great ground planes for roof antennas, built-in RFI shielding, and CHEAP on the used market.
    I agree with the previous commenter about repairability; I’ve been to one of the yards in Seattle that deals with these, and they are handled very roughly, banged around whilst hanging from cranes… and need to be fully accessible for welding repairs (most of which are on the lower long edges). So internal modularity is important (as well as first-rate shock isolation).
    Hope to cross paths again sometime… I should be down that way in Nomadness sometime around June.
    Cheers! -Steve

  43. Sam Perth

    Now you can really move your datacenter to the arctic, like the CIO in Sun’s spoofs. All you’ll need in addition to your container is this: http://mocoloco.com/archives/003274.php an all terrain cabin, for the lucky admin.

  44. Kevin Hutchinson

    You’re a busy man, so I expect you don’t remember me emailing you half a year ago talking about “zeitgeist”. But this kind of Black Box datacenter innovation demonstrates that you and all the good guys at Sun right now have “the zeitgeist”. Let’s have dinner sometime.

  45. Valencia Poulty

    You go Jonathan – Keep up the good work—-Sun has more cool stuff happenin’ than all the other players combined.

  46. Rob Goodson

    Well, ya may want to make a hefty donation to the American Red Cross for the internationally recognized trademark your marketing folks used in the mockup photos. I know for a fact that it’s been aggressively defended as a trademark for years. That being said, I’d personally love to get some Sun equipment into the local chapter here in San Mateo County, but that’s a pretty big goof-up to have made.

  47. [Trackback] The geek news of the day is the unveiling of Sun Blackbox, a more than interesting gadget.
    The folks at Sun have started to think inside the box. While every manufacturer tries to miniaturize components, they pr…

  48. Bob Duignan

    a. It’s been done before. Plenty of times. Most cell base stations have a container with the equipment, for example (perhaps not as dense, but the principle is the same). The military has been at it for decades.
    b. The examples given are moronic, and clearly not thought through… you guys must be smoking good weed!
    – Seismic modelling on one rig? Why not get the telemetry via satellite and process it centrally, rather than sending a supercomputer to each site (a very costly proposition – land value on a rig is higher than central Tokyo; you’d have to chopper it out). Computer + sea air = does not compute!
    – National power grids have been around for nearly a century; what benefit does placing a datacentre beside the power source bring? This from the company that claims the network is the computer – very PC…
    – Why would you need 10,000 clients in a disaster zone where people are starving? So they can blog about what it’s like to starve? I’m sure they would appreciate shelter, food and medicine far more…
    – Placing it on a roof ain’t that simple: how does it get there, will the roof support it, how do you maintain it if a part fails (who fixes it!)? With that many parts, I’d be interested in how many service calls are needed to replace them. I know from mobile phone operators (who already do this) that this is very tough.
    All in all, it’s been done before. Perhaps some limited sales for datacentre overflow, but not a heck of a market. Back to the drawing board!!

  49. Abhishek

    Project Blackbox could have survived if you could have packed some computing power into a decent-size suitcase and priced it at around 1,000 USD.
    People who can spend 500,000 USD and afford the transportation and shipping of that beast can just as well spend a little more and grow their own datacenter based on traditional, proven datacenter concepts.
