
Follow the cold data center model?

One of the interesting things that came out of an interview I did with the Wall Street Journal was that someone, in this case Michael Casey, picked up on something that few do: there is another operations model that most companies don't consider when thinking about a data center strategy, especially at the upper realm of power (and cooling) densities. It's a 'follow the cold' model.

What we discussed was the problem Bitcoin mining companies will face in the Northern Hemisphere as summer approaches: more heat. Mining is great when it's cold out because a rig can help heat 1,000 square feet. You're making Bitcoin AND heating your living space, a double benefit: the electricity you pay for to mine also covers your heating costs, because the rig turns electricity into heat just like any other computer. Now that winter is over in the Northern Hemisphere and summer is on the way, that formula isn't very attractive.

That same electricity that is making Bitcoin is now a liability, because the last thing you want is more heat when it's 100 degrees outside already. So you pay for electricity to cool your living space AND to keep the rig from getting wonky when it goes outside its operating temperature range. You're paying double for power to do the same thing. So how does a miner normalize things?

One simple and pretty basic way to do it is to follow the cold: ship rigs to a place in the Southern Hemisphere and plug them in, so you get the heat benefit where the climate is ideal for it. The things to figure out are the cost of electricity, any language barriers, and the security of the facility and its operator. You don't want to ship an expensive rig, or a handful of them, to an operator who sells them out the back of a remote mountaintop hideaway because he doesn't have electricity to begin with.

I still think the way to mine, and make money, is to do it only at scale and move off of air cooling, which is too expensive when you run hot machines. Look at liquid cooling like the solution from Allied Control, which has already deployed it for a mining operation in a challenging environment: Hong Kong. Then you can stay put, mine, and make money, as tempting as shipping your rig to an exotic cold spot for the winter down under might be…



The Summer of Bitcoin – Hotter Than You Think

Last week I had the opportunity to spend some time with the folks at Allied Control to talk more about immersion cooling, its impact on Bitcoin, and ultimately its impact on computing globally. One of the interesting things we discussed was what miners and data center operators were just starting to FEEL: the heat problem.

You see, when Bitcoin mining started getting a lot of press mentions (before MtGox, Silk Road, and the other tabloid-esque stories) and was starting to take off, it was October of 2013. That was six months ago. Since then the hash rate has grown 46 TIMES. Not 46 percent, but 46 TIMES in just six months. That is a lot of hardware being built and shipped to support the growth of something that is still being defined as often as it is reviled, yet has grown faster than anything else in the past five years. Given that level of growth, which by the way isn't sustainable, there will be some plateaus, cliffs, and other negative trends that will impact the Bitcoin ecosystem. The most worrisome is heat.

The reason heat is so worrisome is because heat is a cancer on the roots of the ecosystem.

Heat eats away at profits in the form of cooling costs, it spreads as the Bitcoin ecosystem spreads, and if not dealt with aggressively it may not kill the ecosystem, but it will kill a lot of the ecosystem's food chain, because it takes more money to cool the rigs than the rigs can make mining. Every piece of hardware put into service adds more heat to the ecosystem. Dealing with that heat is breathtakingly inefficient today, and most computer equipment is one tenth the density of a Bitcoin rig. One 4U Bitcoin rig generates 2.5 kW. That was the electrical footprint of an entire 42U cabinet just a few years ago. A 42U cabinet holds at most 10 rigs and draws 25 kW. And unlike computer equipment that ebbs and flows with elastic traffic, workloads, and users, Bitcoin rigs get turned on and they go. All gas, no brake until they get replaced. So you pay to cool 25 kW per cabinet from the moment you plug the rigs in and turn them on.
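To put rough numbers on that, here is a back-of-the-envelope sketch in Python. It is just my arithmetic on the figures quoted above (2.5 kW per 4U rig, 42U cabinets); the 3,412 BTU/hr per kW figure is the standard conversion, not something from any vendor spec.

```python
# Back-of-the-envelope heat math for a cabinet full of Bitcoin rigs.
# Rig figures come from the post above; 3,412 BTU/hr per kW is the
# standard conversion of electrical load into heat.

RIG_POWER_KW = 2.5      # one 4U rig
RIG_HEIGHT_U = 4
CABINET_U = 42
BTU_HR_PER_KW = 3412

rigs_per_cabinet = CABINET_U // RIG_HEIGHT_U       # 10 rigs
cabinet_kw = rigs_per_cabinet * RIG_POWER_KW       # 25 kW
cabinet_btu_hr = cabinet_kw * BTU_HR_PER_KW        # ~85,000 BTU/hr

print(f"{rigs_per_cabinet} rigs -> {cabinet_kw:.0f} kW per cabinet, "
      f"{cabinet_btu_hr:,.0f} BTU/hr of heat, 24x7x365")
```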

Going to a data center company to host your rigs is an expensive proposition. The cooling systems in most data centers older than three years (99% of them) cannot effectively cool that concentration of heat. Even modern facilities that can cool it do so at a premium. Want to mine in the cloud? Forget it. We talked to an animation/CGI company a year ago, and they build their own data centers because going to the cloud meant a 900% increase in cost for 30 kW cabinets, most of it the cost of spreading the horsepower of those 30 kW cabinets across too many cloud servers. Why? Heat. The 30 kW cabinets run really hot.

At that density, it is like trying to blow into the nozzle of a heat gun to keep your face from burning.

So Bitcoin mining is not the only business where heat is an issue, but miners are the newest and least experienced at dealing with it. Bitcoin has largely grown up in the company of cowboys and 20-30 somethings mining on a rig that also doubled as a heater this winter, which was an efficient use of electricity: since computers turn electricity into heat, running a rig meant mining Bitcoin and getting heat as a bonus. Now that these folks need air conditioning to keep their apartments cool, the added heat from a mining rig makes it VERY expensive to mine. Electricity is now needed to cool both the room and the rig, the rig is adding heat on top of the heat and humidity that come with summer, and you pay for the electricity to mine and to stay cool. You double down, or in business speak, double your costs, and those costs vary with the climate. That's a lot of risk added to a risky business.

The issue with cooling technology to date is that air is nearing the end of its useful life. Air cooling got us to where we are: 30 years after the first data centers were built to deliver 25 watts per square foot and cool it effectively, we are WAY past that. In fact, three years ago you were boasting if you built a facility to 250 watts per square foot, and even that is not dense enough for Bitcoin. Every high density facility will tell you they can cool a Bitcoin footprint, but tell them you want to put 400 racks in a 10,000 square foot room, with an average rack density of 25 kW drawn 24x7x365 from the moment you light each rack, and see how they respond.

You are asking them to put 10 megawatts in what has traditionally been a 2.5 megawatt space and cool a load four times hotter than their best design can handle. And that is just for the rigs that will ship in the next 90 days. So yes, Virginia, you can cool Bitcoin rigs with air cooled systems, but not at a cost that is palatable, or even worth it, for the operator or the mining pool. Using air cooling will make it virtually impossible to make money mining: the chips in the rigs are too hot to be cooled with air.
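As a sanity check on that room-level claim, the same kind of arithmetic (mine, using only the numbers above) shows just how far the ask sits outside an air-cooled design:

```python
# Room-level density check: 400 racks at 25 kW in a 10,000 sq ft room,
# versus a facility designed for 250 W/sq ft. All figures from the post.

RACKS = 400
RACK_KW = 25.0
ROOM_SQFT = 10_000
DESIGN_W_PER_SQFT = 250          # the "boast-worthy" design of three years ago

total_mw = RACKS * RACK_KW / 1000                        # 10 MW
actual_w_per_sqft = RACKS * RACK_KW * 1000 / ROOM_SQFT   # 1,000 W/sq ft
design_mw = DESIGN_W_PER_SQFT * ROOM_SQFT / 1_000_000    # 2.5 MW

print(f"{total_mw:.0f} MW in a room designed for {design_mw:.1f} MW: "
      f"{actual_w_per_sqft:.0f} W/sq ft vs {DESIGN_W_PER_SQFT} W/sq ft")
```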

So we need a technology better than air: liquid cooling.

There are two types that have emerged as solutions in the market: one is oil, the other a manufactured fluid like Novec from 3M. In a data center application, oil is messy. It clings to boards, it drips, it gets on and soaks into anything it comes into contact with. In videos I have seen touting its awesomeness, I cannot see a server tech wanting to put on gloves to go add a server to an oil tank. The other thing I noticed in the videos was that the tanks sat on raised floor with CRACs in the background. Why cool the ambient air with air conditioners? Someone is paying for that, no matter how much you are saving. The other issue with oil is that you still need to leave the heat sinks on the boards, periodically pull the boards out to scrape the paste buildup off the heat sinks, and then reattach them with NanoFoil. The whole process is time consuming, messy, and expensive.

The Novec liquid immersion cooling solutions, like the Immersion-2 option from Allied Control, are the ones I think hold the most promise. The liquid rolls off the boards and they come out dry: no dripping on my shoes or the floor, no ruined clothes. It does not require a massive chiller plant to make it work, and the cooling can be dialed up or down to handle whatever density gets thrown at it. It's non-toxic. A tank is a tank, so as densities increase, the method of cooling remains constant, as do the equipment, the solution, and the operational tasks that go with it. Once you go liquid, there is one low-maintenance system to learn, and you can fit 200 boards in a 42U-equivalent cabinet. That is a density air cooling and cabinets can't manage effectively.

I think the headwind ALL of the liquid/immersion cooling solutions will face is with hardware manufacturers. Liquid cooling negates the need for fans, cases, and cables for the components inside the box. That cost is stripped out, and while it logically makes sense to want lower costs, if those costs translate to margins, then adoption slows. The perception is that the hardware manufacturer loses money. The reality is that the manufacturer can hold its price, because a customer's perception that they are paying more for a stripped-down model with no case is quickly smashed when they get their first data center bill and it is 50% less to run more powerful gear. I am willing to bet the winners will be the hardware manufacturers that want an integrated stack: their lowest cost, highest performance hardware in an environment that is nearly free to cool. Or the cloud provider that can cool more powerful hardware without paying for the electricity to do it, which gives them a competitive advantage over EVERY other cloud provider, because the others will be paying for electricity to move air around their data centers, on top of the cost of the chillers, piping, CRACs, and CRAHs needed to do it.

So as things get hotter for Bitcoin, more miners will be driven to find a cost effective way to cool hardware that keeps getting hotter, because the power required to cool a rig is going up along with the temperatures. Summer is coming and it's getting hotter already…

Stay cool, and look at liquid cooling.


Math of Bitcoin – IV

In this installment, I wanted to throw out some reference data for people trying to size their heat signature so they can make better decisions about where to host rigs for Bitcoin mining.

1 Rig = 4U

1 Rig = 2.5 kW

1 Rig = 8,540 BTU/hr

1 Rack of rigs (8 rigs) = 20 kW

1 Rack of rigs = 69,000 BTU/hr

1 Ton of cooling for every 12,000 BTU/hr

So for one (1) rack of rigs, the load is 69,000 ÷ 12,000 ≈ 5.75 tons; provision 7 tons of cooling to leave headroom

Figure 400 tons of cooling per MW in most facilities (355 tons for 1.2 MW)

Figure 0.6 kW per ton to run a chiller, so 240 kW per 400 tons

So for 1.2 MW in 10,000 square feet of space, you can put in no more than 57 racks, or 456 rigs

So 912 TH/s is the hash rate for that 1.2 MW footprint using CoinTerra TerraMiner IV rigs (2 TH/s each)
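For anyone who wants to rerun this sizing math with their own rig counts, here is the whole chain in one short Python sketch. The two BTU constants are standard; every other number is taken from the reference data above, and the 2 TH/s per rig is the CoinTerra TerraMiner IV figure implied by the 912 TH/s total.

```python
# Cooling-sizing sketch built from the reference data above.
RIG_KW = 2.5               # one 4U rig
RIGS_PER_RACK = 8
BTU_HR_PER_KW = 3412       # standard: heat produced per kW of load
BTU_HR_PER_TON = 12_000    # standard: one ton of cooling capacity
CHILLER_KW_PER_TON = 0.6   # electricity to run the chiller, per ton
FACILITY_RACKS = 57        # the cap for 1.2 MW / 10,000 sq ft above

rack_kw = RIG_KW * RIGS_PER_RACK               # 20 kW per rack
rack_btu_hr = rack_kw * BTU_HR_PER_KW          # ~68,000 BTU/hr per rack
rack_tons = rack_btu_hr / BTU_HR_PER_TON       # ~5.7 tons of load per rack

facility_tons = FACILITY_RACKS * rack_tons           # total cooling load
chiller_kw = facility_tons * CHILLER_KW_PER_TON      # power just for the chillers
rigs = FACILITY_RACKS * RIGS_PER_RACK                # 456 rigs
hashrate_th = rigs * 2.0                             # TerraMiner IV: ~2 TH/s each

print(f"per rack: {rack_kw:.0f} kW, {rack_btu_hr:,.0f} BTU/hr, {rack_tons:.1f} tons")
print(f"1.2 MW room: {rigs} rigs, {facility_tons:.0f} tons of cooling, "
      f"{chiller_kw:.0f} kW of chiller power, ~{hashrate_th:.0f} TH/s")
```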