Alaska can attract big data centers
Alaska engineer Sid Huhndorf says the North Slope should be getting hyper-scale data centers, a win-win for them and us Kay Cashman Petroleum News
The following research paper, The Space Race for a Cleaner and Greener World Wide Web, is the first such document published in Petroleum News.
It was written by Sid Huhndorf, a 25-year-old Alaska oil and gas engineer, who thinks the mega data centers making their way to states such as California, Oklahoma, Virginia and Ohio would make a lot more sense on the North Slope for the owners of those centers.
“My primary goal with the research paper is to get the conversation going with Alaskans and their representatives. … We have the hydrocarbon resources, thermal resources, land resources, and industrial resources that these other states do not,” Huhndorf said.
The paper “specifically addresses the economic and technical upsides for both parties involved in the implementation of data centers in Alaska,” he said.
Most data centers worldwide are in some of the hottest places on earth, which Huhndorf said leads to the “seldom discussed issue of energy consumption to cool these massive facilities that provide consumers with reliable and fast internet access.”
For example, Amazon Web Services “houses a majority of its data centers in the contiguous United States south of the +40° line of latitude,” he said, whereas arctic environments offer unique and plentiful sources of energy, such as “hydroelectric and natural gas that are difficult and costly to transport.”
Using “modern heat pump technology, any extra heat that is produced from the data centers can be used to heat the data center facilities” in the colder months.
As for the benefits to Alaskans, Huhndorf said there is a budget shortfall and “we still have no good solution to the 35-plus trillion standard cubic feet of natural gas sitting on the North Slope with nowhere to go and a promise from a country that has no skin in the game.”
He offers suggestions to state government to get Alaska to the “bleeding edge of this space race.”
The Space Race for a Cleaner and Greener World Wide Web By Sid Huhndorf, Engineer
DARCY Analysis & Modeling LLC June 2019
1 Abstract
A great majority of data centers worldwide are located in some of the hottest places on earth. This leads to the seldom discussed issue of energy consumption to cool these massive facilities that provide consumers with reliable and fast internet access.
A prime example of this is Amazon Web Services, which houses a majority of its data centers in the contiguous United States south of the +40° line of latitude [12].
Currently there is a space race of sorts to relocate a majority of these data centers to arctic locations for two reasons:
1. The reduction in cooling energy costs and carbon footprint which helps greatly with public image.
2. Arctic environments are home to very unique and plentiful sources of energy, such as hydroelectric and natural gas that are difficult and costly to transport to in-demand markets far away.
People in Alaska have found themselves with two major issues. We are in the midst of an extremely serious budget crisis, which has recently seen a rash solution that will lead to future costs to the state economy as people leave the state and infrastructure, services, etc. fail to adjust fast enough. The second issue, to which we still have no good solution, is the 35+ trillion standard cubic feet of natural gas sitting on the North Slope with nowhere to go and a promise from a country that has no skin in the game [3].
This proposition report comprises high level technical and economic studies, which will explore and support the move of data to Alaska’s Arctic.
For Alaska to get itself to the bleeding edge of this space race, the State must hire a task force of economists, engineers, accountants, and business professionals to produce a turnkey contracts package (once signed, money starts flowing and construction begins) for potential data center owners who are interested in investing in their bottom line, their public image, and their contribution to the global good, and who, on top of all of that, want to be the first in the United States to pioneer Arctic Data.
This task force would evaluate the bottom line and environmental cost of keeping these data centers in hot locations vs. the payoff period when located on the North Slope utilizing natural gas fired power generation.
This new revenue stream would allow for a very sudden injection of external funding into the State of Alaska for the initial construction of the data center facilities, which should be more than enough to make a meaningful impact on the nearly $2B state budget deficit we currently find ourselves in, as well as diversify our external revenue stream (the long-fought battle to get some of the eggs out of the crude oil basket).
Finally, using modern heat pump technology, any extra heat that is produced from the data centers can be used to heat the data center facilities during the winter months as well as the colder summer months.
2 Introduction
It is clear that every year more and more of the technological marketplace is transitioning from traditional utilization, to being tied in with the worldwide web in some regard. This can be from the watch on your wrist to the phone on your desk. Among other things, this increasing trend suggests that the energy demand for the data centers which support the weight of the worldwide web will continue its upward trend as well. There is no fancy method to stop or even reverse the increasing energy demand for all this data; but there are smarter ways of spending the energy invested into the worldwide web.
From an energy standpoint, data centers are simple entities. The energy demand of a data center can be split into 4 distinct categories [13].
1. IT equipment (servers, hard drives, and data transceivers)
2. Backup power supply (Uninterruptible power supply)
3. Cooling Systems
4. Lighting
Of these four categories, this study is going to focus solely on the cooling system employed in a data center, as this is the category in which the State of Alaska has the most to offer.
A distinctive feature of data centers is their physical size. Many different sizes and scopes of data center are possible. Currently the majority of data centers are less than 2,000 sq. ft., and of that majority approximately 30% are less than 100 sq. ft. [13]. This means that the majority of data centers within the United States are smaller than a standard conference room. For the most part these small data centers are the property of, and are used by, small city governments, small colleges and universities, government buildings, small businesses, etc. The bottom line is that despite the energy inefficiency of all of these little data centers, it’s simply not worth the time and energy to attempt to implement efficiency solutions on a case by case basis.
Instead, this study will explore the efficiency solutions that Alaska can offer the hyper-scale data centers, classified as data centers exceeding 400,000 sq. ft. [13]. These are the massive factory-like facilities that most people think of when data centers come up. Although hyper-scale data centers are still the minority of all data centers, the trend of the last 10 years strongly suggests that this won’t be the case for much longer, as data services (web hosting, cloud storage, cloud software, etc.) such as cloud computing nudge the market in the direction of hyper-scale data center consolidation [13].
3 Technical Proposal
When it comes to efficiently cooling an area as large as a basketball gymnasium that is being heated by tens of thousands of industrial-sized servers all running at max speed, a whole new realm of research and science becomes involved.
Luckily, engineers have worked this issue for more than 50 years, and in that time the most cost effective and thermally efficient technique for cooling facilities such as a hyper-scale data center has historically proven to be CRAC (computer room air conditioning, Figure 1) units in conjunction with evaporative cooling units, using water as the working fluid, or refrigerant as the working fluid in the case of a high humidity environment.
These CRAC systems are simple, reliable, and efficient when it comes to removing vast quantities of thermal energy 24/7 from these hyper-scale data centers.
In terms of data center cooling, CRAC systems such as the one in Fig. 1 are as standard as wheels on a car.
The focus of what Alaska has to offer to the industry doesn’t have anything at all to do with re-inventing the CRAC system design. Instead it focuses on providing access to a much larger and nearly 100% annually consistent thermal sink as well as access to the largest proven natural gas reserves on the whole West Coast of the United States [1].
The idea is fairly straightforward and intuitive. Instead of continuing the environmentally unfriendly trend of locating these massive data centers based on the cheapest land grab with access to grid power, we instead start locating them on the North Slope of Alaska, where the average annual temperature is the coldest the United States has to offer and the natural gas has no market to go to. There are 5 primary technical drivers that put the North Slope of Alaska at the top of the list for becoming a leader in environmentally friendly data center implementation.
1. The North Slope would provide responsible and environmentally compliant use of the Beaufort Sea as the thermal sink for cooling purposes.
2. The North Slope would provide a nearly unlimited quantity of natural gas which currently has no market. Natural gas, despite being a fossil fuel, is extraordinarily clean burning with absolutely no particulate output and very little greenhouse gas emission (50% less than coal) [15].
3. In the past 5 years Alaska was tied into international fiber optic by the company Quintillion, which successfully ran fiber optic all the way to the North Slope of Alaska, where it is in use today through the telecom company Alaska Communications Services. Given the success of this one tie-in, there is very obvious potential for any needed upgrade, such as simply running more fiber across the state along the same route [11].
4. The North Slope can provide access to the use of vast commercial and technical resources such as contractors, equipment, supplies, man-camps, etc. The North Slope is essentially a 10,000 sq. mile work site in which any possible commercial resource can be made available.
5. Environmental compliance and safety standards on the North Slope are absolutely world class and can give the data center public image an incredible boost for operating in such a safety driven atmosphere.
3.1 Location
In terms of location on the North Slope the sky is the limit, but there are some stand-out locations that can provide unique physical access as well as access to the available energy resources.
At right is a map of the North Slope of Alaska which depicts a prime location range for a data center.
The location range proposed in Fig. 2 is optimal for data centers due to its proximity to all of the resources available in the Prudhoe Bay area, to the Beaufort Sea for use as a thermal sink, and to Trans Alaska Pipeline System pump station 1 for easy access to North Slope processed natural gas.
3.2 High Level Facility Design
With respect to high level facility design, it won’t be much different from how any other facility is designed on the North Slope, other than the fact that it will be a data center rather than a production facility or a pump station. The Arctic engineering will be the same as what is currently tried and trusted to protect the integrity of the sensitive tundra and permafrost from heat pollution and physical harm. For the sake of clear illustration, below are illustrations which lay out a high-level concept of the facility and its implementation.
At a very high level, Fig. 3 accurately illustrates the layout and function of a would-be hyper-scale data center at the shores of the Beaufort Sea. The natural gas power station would receive its supply of natural gas from all producers who opt to offtake from their resources and input natural gas into the pipeline system. The power station would then produce the required power for the data center, which in the case of Amazon Web Services is 31.68 MW [7][13]. From there the waste heat produced by the 80,000+ industrial servers is rejected to the Beaufort Sea via a thermal distribution grid (Fig. 4) so as to greatly reduce the probability of noticeable and significant habitat damage [7].
The thermal distribution grid will be sized as a function of how quickly heat can be rejected and dissipated in any given volume of the Beaufort Sea.
The idea behind it is to determine the rate by which thermal energy can be dissipated to the sea without the energy ‘piling up’ in a given volume, thereby raising the temperature of that volume of water and disturbing the Beaufort Sea habitat in a negative way.
Beyond the thermal grid doing its intended job well enough that no harm comes to the Beaufort Sea habitat through heat pollution, its other critical feature is that it functions as a very low maintenance system. The best way to reduce maintenance frequency is to reduce the complexity of the system as much as possible. Figures 4(a) and 4(b) illustrate a simple and effective proposed design for the grid.
One important aspect of the piping the grid is comprised of is that it will be heavily insulated to minimize quasi-uncontrolled thermal seepage into the Beaufort. The only exposure the rejected thermal energy will have to the sea will be through the precisely distanced and milled discharge orifices.
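The sizing logic described above can be illustrated with a rough mass-flow estimate. Note the assumptions: the 31.68 MW rejection rate is the report's total power figure, the seawater heat capacity is the report's cited value, and the 2 °C allowable temperature rise is a purely hypothetical number chosen for illustration only:

```python
# Hypothetical sizing sketch: seawater mass flow needed to carry away the
# data center's waste heat within an assumed allowable temperature rise.
# Q = m_dot * cp * dT  =>  m_dot = Q / (cp * dT)
Q_WATTS = 31.68e6          # total heat rejected (the report's 31.68 MW figure)
CP_SEAWATER = 3993.0       # J/(kg*C), per the report's cited seawater value
DELTA_T = 2.0              # C, assumed allowable rise -- illustrative only

m_dot = Q_WATTS / (CP_SEAWATER * DELTA_T)   # kg/s of seawater required
print(f"~{m_dot:,.0f} kg/s of seawater at a {DELTA_T} C rise")
```

A smaller allowable rise (a more conservative habitat constraint) scales the required flow, and hence the grid size, inversely.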
As figure 5 points out the discharge pipeline that transfers the heated sea water back to the thermal distribution grid will be placed on thermopiles for a sufficient distance to not expose the sensitive tundra to any harmful levels of thermal energy.
All concepts that have been presented in this section of the proposal are very high level and generalized, therefore in no way represent the final turnkey product intended.
4 Economic Proposal
All new technical ideas proposed are great and pie in the sky up until the point they are put under the harsh microscope of an economic feasibility study. For the sake of this proposal we’ll run through some ‘back of the napkin’ numbers to feel out the economics of such a project.
The first thing that should be considered is the approximate size and power distribution of an Amazon Web Services data center. For obvious trade secrecy reasoning this information is impossible to come by in exact numbers, but some tech journalists have narrowed Amazon data centers down to an approximate number and size.
According to longtime tech journalist Rich Miller of DataCenterFrontier.com, Amazon Web Services has stated in company presentations that it sizes hyper-scale data centers at somewhere between 50,000 and 80,000 servers per data center [7].
For such a rough economic dive into a North Slope data center, this rough range is a perfect starting point. For the sake of argument, we’ll move forward with the assumption that if this were to come to fruition Amazon would go all in and implement a standardized data center on the high end of the estimate with 80,000 servers.
In terms of geographic space required, Rich Miller confirms in his article [7] that the third party developer for Amazon (Corporate Office Properties Trust) has stated that the two contracted sizes for Amazon are 150,000 sq. ft. (388’x388’) or 215,000 sq. ft. (464’x464’). A reassuring side note on these stated footprints; despite sounding big, these footprints are in fact quite small relative to nearly all drill pads on the north slope which average between 400’x400’ to 700’x700’ some even exceeding 1000’x1000’ (1,000,000 sq. ft.).
Now that we have a verified number of servers to work with, we can build from the ground up the approximate power demands of a North Slope hyper-scale data center.
At the ground level we need to know the power demands of one single commercial industry server that Amazon Web Services would most likely use. Luckily there has been a study.
In 2016 the Berkeley National Lab released a report detailing the findings of a 16-year study of data center power usage, funded under a U.S. government contract. The report runs 65 pages, much of it not directly relevant to this proposal. That said, it presents useful numbers that will be leaned on heavily here, including its findings on the power usage of a single commercial industry server.
Data center servers have two distinguishing features: they are either 1 socket servers (1S) or they contain 2 or more sockets (2S+), and they are either distributed by large computer brands such as Dell, HP, and IBM, or they bypass these brands in the supply chain and go directly from manufacturer to corporation [13]. Generally, it is only hyper-scale size data center companies who order unbranded servers from the manufacturer.
From the Berkeley National Lab report, the Standard Performance Evaluation Corporation using the Server Efficiency Rating Tool (EPA rating) estimated in 2013 that for both (1S) and (2S+) branded and unbranded servers the averaged maximum power draw is 330 watts [13]. This 330 (watt/server) figure is an average of many servers with varying socket counts and manufacturing paths (1S, 2S+ Branded, Unbranded).
In addition to this key ground level power consumption number the Berkeley National Lab report also released findings on the Infrastructure (facility) power usage as well. It is presented as a metric referred to as the Power Usage Effectiveness Metric (PUE) which is the total energy required by the data center in relation to the energy needed for the IT equipment. A data center with a PUE of 1 would use zero electrical power for anything other than the IT equipment [13].
From this PUE table, in conjunction with the number of servers from Rich Miller and the averaged maximum power draw for a single server (Table 1), it will be straightforward to deduce the approximate power demands of an Amazon Web Services data center.
The first thing we can do is calculate the total power demand of an 80,000 server data center like so.
Total Power Demand (MW) = (330 W × 80,000 servers) × (1 + 0.02 + 0.16 + 0.02) / 1,000,000 = 31.68 MW
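The arithmetic above can be double-checked with a few lines of Python; the inputs (server count, 330 W average draw, and the PUE overhead fractions) are all figures the report already assumes:

```python
# Rough power-demand check using the report's assumed figures:
# 80,000 servers at an average maximum draw of 330 W each, plus the
# Berkeley Lab PUE overheads (2% power distribution, 16% cooling, 2% lighting).
SERVERS = 80_000
WATTS_PER_SERVER = 330
PUE_OVERHEAD = 0.02 + 0.16 + 0.02       # infrastructure fractions beyond IT load

it_load_mw = SERVERS * WATTS_PER_SERVER / 1e6   # 26.4 MW of IT equipment
total_mw = it_load_mw * (1 + PUE_OVERHEAD)      # total facility demand
cooling_mw = it_load_mw * 0.16                  # the 16% cooling fraction
print(f"{total_mw:.2f} MW total, {cooling_mw:.3f} MW for cooling")
```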
One single hyper-scale data center for Amazon consumes approximately 31.68 MW of power. Per the Berkeley Lab report PUE table, the cooling systems add 16% on top of the IT load, meaning this Amazon data center consumes (0.16 × 26.4 MW) 4.224 MW of power for the cooling systems alone. To put this number into perspective: according to the Idaho Public Utilities Commission [5], 1 MW of electricity is enough for 650 residential homes, which implies that 4.224 MW is enough power for over 2,700 residential homes – spent only on cooling one single data center.
From this point we can now evaluate a base case cost of cooling for this data center so far. Assuming this data center was one of Amazon’s 3 data centers in northern California, we can use the average cost of electricity for the industrial market in California which is $0.1049/kWh as per ElectricityLocal.com [9]. Moving forward with this number we can calculate a base case cooling cost per day for this data center located in the northern California location like so.
California Cost per Day ($) = 4224 kW * $0.1049/kWh * 24 hrs = $10,634.34/day
We know that so far no Amazon data centers use any water source as a thermal sink; the cooling systems reject heat directly to the atmosphere, which is where the primary economic advantage of using Beaufort Sea water comes into play. To roughly calculate the cost savings of using Beaufort Sea water vs. atmospheric air as a thermal sink, we can simply take the ratio of their respective heat capacities like so.
Heat capacity of seawater = 3.993 kJ/(kg·°C) [6]
Heat capacity of air = 1.158 kJ/(kg·°C) [16]
Under the assumption that the working fluid is pumped with the intent of transferring thermal energy at the same rate as the base case, the energy saved using seawater can be estimated by simply taking the ratio of heat capacities: the seawater sink can accept that much more thermal energy per unit mass from the working fluid, so the working fluid does not have to be pumped as fast.
Cost savings of Seawater per Day ($) = $10,634.34 – ((1.158/3.993)*$10,634.34) = $7,550.30
This rough calculation implies that by simply using seawater from the Beaufort Sea as a thermal sink rather than air from California one single data center would yield $2,755,861/yr in savings. That is money going straight into the pocket of Amazon; consider it a stream of passive income. We’re not even to the natural gas section of the economic analysis.
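The base-case cost and the seawater-ratio savings above can be reproduced directly; all inputs are the report's own figures (4,224 kW cooling load, the California industrial rate, and the two heat capacities):

```python
# Base-case daily cooling cost in California vs. a rough seawater-sink estimate.
COOLING_KW = 4224
CA_RATE = 0.1049                        # $/kWh, California industrial average
CP_AIR, CP_SEAWATER = 1.158, 3.993      # kJ/(kg*C)

ca_cost_per_day = COOLING_KW * CA_RATE * 24
seawater_cost_per_day = ca_cost_per_day * (CP_AIR / CP_SEAWATER)
savings_per_day = ca_cost_per_day - seawater_cost_per_day
savings_per_year = savings_per_day * 365
print(f"${ca_cost_per_day:,.2f}/day base; ${savings_per_day:,.2f}/day saved "
      f"(${savings_per_year:,.0f}/yr)")
```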
Now we can factor in the use of zero transportation natural gas power for the data center. In terms of hard-line economics the following proposal is the serious selling point.
We in Alaska are currently sitting on 35+ trillion cubic feet of natural gas that is not only NOT making us money, it’s costing us money, since it must be re-injected back into its native reservoir using massive re-injection stations across the North Slope [3].
Because the data center would be located with untapped natural gas right in its back yard, there would be zero transportation costs tacked onto the gas, which is what currently makes the gas unprofitable to produce from the reservoirs. By using the natural gas for energy right where it comes out of the ground, the transportation cost problem is completely eliminated and the natural gas becomes competitive on the market. This fact by itself is very appealing to any company looking to implement an energy hungry fleet of facilities. The next garnish to the proposal simply makes the deal irresistible.
Due to the fact that Alaska would have the corner on this niche market, it would be in Alaska’s best interest to provide a client like Amazon an irresistible bonus incentive that makes the energy package truly turnkey in nature. Instead of selling the natural gas to Amazon at fair current market value, we sell it at a price no one in America can compete with for a set project payoff period (100% return on investment). Currently our closest neighbor, Washington, has the lowest industrial electricity price, coming in at $0.0471/kWh [2].
If we instead provide Amazon electricity from natural gas power stations at $0.04/kWh for a payback period of 10 years with no “ifs”, “ands”, or “buts”; there’s simply no possible way any client could say ‘no’ to such a sweet deal.
This scenario would yield the following savings for Amazon and the following gross gas sales revenue for Alaska.
Cooling Cost per day @ proposed price ($) = 4224 kW * $0.04/kWh * 24 hrs = $4,055.04/day
Include the cooling cost savings from using the seawater as a thermal sink and you have a final cooling cost per day on the North Slope of:
Final Cooling Cost per day for Amazon Data Center ($) = $4,055.04/day * (1.158/3.993) = $1,175.99/day
Final Cooling Cost Savings per day ($) = $10,634.34/day − $1,175.99/day = $9,458.35/day
Final Cooling Cost Savings per year ($) = $9,458.35/day × 365 days = $3,452,297.87/yr
Full Fleet Cooling Cost Savings per year ($) = $3,452,297.87/yr × 16 = $55,236,765.95/yr
Full Fleet Cooling Savings over 10 yr period ($) = $55,236,765.95/yr × 10 yr = $552,367,659.50
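The whole chain, the proposed $0.04/kWh rate plus the seawater heat-capacity ratio, extrapolated to the 16-center fleet over 10 years, can be verified in one short script (small differences in the last dollars come from rounding the intermediate figures above):

```python
# Cooling cost at the proposed fixed $0.04/kWh rate, with the seawater
# heat-capacity ratio applied, extrapolated to 16 centers over 10 years.
COOLING_KW = 4224
CA_RATE, AK_RATE = 0.1049, 0.04         # $/kWh: California base vs. proposed
RATIO = 1.158 / 3.993                   # air/seawater heat-capacity ratio

base_per_day = COOLING_KW * CA_RATE * 24            # California base case
ak_per_day = COOLING_KW * AK_RATE * 24 * RATIO      # final North Slope cost
savings_per_day = base_per_day - ak_per_day
fleet_10yr = savings_per_day * 365 * 16 * 10
print(f"${ak_per_day:,.2f}/day on the Slope; "
      f"fleet 10-yr savings ${fleet_10yr:,.0f}")
```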
This suggests that for one single Amazon data center located on the North Slope Amazon would be guaranteed $3.4 million in savings income per year. If Amazon were to commit and move their whole fleet of 16 hyper-scale data centers to the North Slope, they’d be in for $55.2 million in savings income for just one year and a grand total of a guaranteed $552.4 million in savings income for the payback period of 10 years.
Don’t forget; we’ve only covered the savings income to Amazon related to the cooling proportion of their power consumption. Now let’s evaluate the effect a fixed natural gas energy price would have on the savings income for Amazon.
Using the same evaluation approach as above, we’ll start with one data center for one 24-hr period and extrapolate up in terms of fleet and time size.
We can quantify the overall savings income of fixed energy price and cooling like so.
Income Savings of Fixed Energy Price and Arctic Cooling per Day ($) = (31,680 kW − 4,224 kW + 4,224 kW × (1.158/3.993)) × 24 hrs × ($0.1049/kWh − $0.04/kWh) = $44,673.51/day
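The combined per-day figure can be reproduced from the report's formula as stated: the effective load is the total demand minus the cooling load, plus the cooling load scaled by the heat-capacity ratio, priced at the difference between the two electricity rates:

```python
# Combined daily savings from the fixed $0.04/kWh rate plus seawater cooling,
# following the report's formula.
TOTAL_KW, COOLING_KW = 31_680, 4224
CA_RATE, AK_RATE = 0.1049, 0.04         # $/kWh
RATIO = 1.158 / 3.993                   # air/seawater heat-capacity ratio

effective_kw = TOTAL_KW - COOLING_KW + COOLING_KW * RATIO
savings_per_day = effective_kw * 24 * (CA_RATE - AK_RATE)
print(f"${savings_per_day:,.2f}/day")
```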
To give that number a bit of perspective; Fred Meyer took in $1.8B in the 2018 fiscal year for its 134 store locations [17]. This suggests that each store location took in an average of $36,802.29/day in sales for FY2018. The income savings for Amazon per day of this proposed move would exceed the gross sales of a Fred Meyer store location.
Table 2 below extrapolates the total income savings per day figure out to the fleet and out to 10 yrs.
Before we move on to the gross revenue of gas sales for Alaska, let’s quickly review the magnitude of these numbers for Amazon. The ‘Income Savings’ can be thought of as net profit to Amazon, because it is operating expense that can be invested elsewhere, thereby growing the company. In relation to the net revenue of Amazon, this $2.6B figure is almost equivalent to what Amazon netted in 2007 and 4% of what it netted in 2018 [14].
Now that we’ve covered the potential proposed income for Amazon, let’s evaluate the gross revenue that will be possible for Alaska.
At $0.04/kWh and with a power demand of 31.68 MW per data center, we have what we need to determine what guaranteed gross income Alaska would find itself into for this particular deal.
For one single data center on the North Slope, the gross income to the State of Alaska would be:
1 yr Gross Income for Alaska Gas @ $0.04/kWh ($) = (31,680 kW − 4,224 kW + 4,224 kW × (1.158/3.993)) × $0.04/kWh × 24 hrs/day × 365 days/yr = $10,049,819.50
10 yr Gross Income for 16 centers ($) = $10,049,819.50 * 16 * 10 = $1,607,971,120.66
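Alaska's side of the ledger uses the same effective load, billed at the proposed fixed rate:

```python
# Alaska's gross revenue from gas-fired power sold at the fixed $0.04/kWh rate.
TOTAL_KW, COOLING_KW = 31_680, 4224
AK_RATE = 0.04                          # $/kWh proposed fixed price
RATIO = 1.158 / 3.993                   # air/seawater heat-capacity ratio

effective_kw = TOTAL_KW - COOLING_KW + COOLING_KW * RATIO
gross_1yr = effective_kw * AK_RATE * 24 * 365       # one center, one year
gross_10yr_fleet = gross_1yr * 16 * 10              # 16 centers, 10 years
print(f"${gross_1yr:,.0f}/yr per center; ${gross_10yr_fleet:,.0f} over 10 yr")
```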
This means that we’d sell our natural gas, which is currently costing us money, for a guaranteed lump sum that approaches the current FY2019 Alaska deficit, assuming the deficit doesn’t grow. The guarantee of 10 years of predictable state revenue is almost worth more than the money itself.
The next natural question is, “Would Alaska be selling itself short by trying to lure clients in with this fixed low-price approach?”. The answer to that question is not absolutely certain but what is certain is the percentage of our reserves that we’d be selling for this fixed price.
A standard industry estimate of natural gas consumption is that 1 standard cubic foot of natural gas will produce 0.29kWh of electrical energy [10]. Using this figure, we can determine the natural gas offtake required to satisfy the 10 yr payoff period for the data centers, like so.
Gas sold for 1 yr and 1 center (scf) = ((31,680 kW − 4,224 kW + 4,224 kW × (1.158/3.993)) / 0.29 kWh/scf) × 24 hrs × 365 days = 866,363,750.36 scf
Gas sold for 10yrs and 16 centers (scf) = 866,363,750.36 scf * 10 * 16 = 138,618,200,057.00 SCF
At first glance this 138.6 billion scf of gas appears to be a gargantuan volume of gas. But if you recall from the abstract, Alaska currently has 35+ trillion standard cubic feet of natural gas reserves costing the state and operating companies money. For sense of scale:
Percent of Proven Reserves on North Slope (%) = (138.6E9 / 35E12) × 100 = 0.396% of Proven Reserves
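The gas-volume and reserves-fraction figures follow from the same effective load and the 0.29 kWh/scf conversion factor cited above:

```python
# Gas offtake needed for the 10-year, 16-center deal, and its share of the
# 35+ tcf of proved North Slope reserves (0.29 kWh per scf, per the report).
TOTAL_KW, COOLING_KW = 31_680, 4224
RATIO = 1.158 / 3.993                   # air/seawater heat-capacity ratio
KWH_PER_SCF = 0.29                      # electrical energy per scf of gas
PROVED_RESERVES_SCF = 35e12

effective_kw = TOTAL_KW - COOLING_KW + COOLING_KW * RATIO
scf_1yr = effective_kw / KWH_PER_SCF * 24 * 365     # one center, one year
scf_total = scf_1yr * 16 * 10                       # whole 10-yr, 16-center deal
pct_of_reserves = scf_total / PROVED_RESERVES_SCF * 100
print(f"{scf_total:,.0f} scf total = {pct_of_reserves:.3f}% of proved reserves")
```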
The State of Alaska would only be selling a mere 0.396% of its currently proved reserves of natural gas to lock this deal in. Whether or not that would be ‘getting taken advantage of’ I don’t know, but I do know that this deal would guarantee a new injection of revenue alongside our archaic revenue sources.
Finally, we have to consider the fact that oil and gas exploration companies have intentionally avoided natural gas discoveries due to the cost of dealing with a resource that is too expensive to move. The 35+ trillion standard cubic feet that are proved out were stumbled upon by exploration companies over the past 50 or so years. Imagine if companies were actually looking for natural gas on the North Slope. Estimates of the potential resources that remain undiscovered on the North Slope run to 200 trillion standard cubic feet [3]. There is a lot of energy that is too expensive to move and a lot of land that is uninhabitable to people.
Sounds like a recipe for a data center metropolis to me.
5 Conclusion
We’ve now covered both a general overview of technical implementation as well as a general analysis of how much money both parties would be making with this deal.
It is clear that on both the technical and economic fronts of this proposal, not all factors are being considered, but I hope it goes to show that a deal like this, so long as it is implemented as safely and as environmentally compliantly as possible, would be incredibly lucrative for both Amazon and the State of Alaska.
An additional source of revenue that wasn’t even considered, but is also significant, is leases of land for the data facilities as well as the power generation facilities from the respective owner.
Another significant consideration that must be mentioned is the fact that this proposal only focuses on Amazon. There are other huge entities (Google, Netflix, Microsoft, Apple, Federal Government, etc.) that have their own data centers which are also in thermally inefficient areas of the United States.
I’d like to end by saying that the only move worse than making the wrong move is making no move at all in a time of need; and we Alaskans are in a time of extremely serious fiscal need, with the means at hand to help ourselves. I see no better time to focus the eyes of the world on us once again as a leader in fossil fuels.
6 Closing Thoughts & Discussion
The overall scope of this technical and economic proposal is shallow in depth and has limitations that must be addressed.
At the forefront of the limitations to this proposal is access to credible capital investment information for mega projects such as the construction of hyper-scale data centers and of natural gas fired power facilities. As a consequence of this limited access to credible numbers, those costs were not factored into the final results of the economic proposal in Section 4.
Next among the limitations that directly affected this proposal is the time available to invest in research, drafting, and editing. I am confident the depth of research would be magnitudes greater had more free time been available to invest.
Alternative facility locations were not explored to any extent in the technical or economic portions of this proposal, for two reasons. First, in the traditional spirit of a proposal, it is inappropriate to present alternatives to the base case as if they are equals. Second is the simple lack of available time to invest in research for this proposal.
7 Suggestions for Continuing Research
If an individual such as myself or someone else wished to take the research path presented in this proposal further, I would suggest the following.
It would add to the credibility of this proposal to continue to increase the depth of research into the economic aspects of this project. A very important economic factor that was purposefully left out due to lack of credible sources is the capital investment required to implement a natural gas utility on the North Slope that would provide the hyper-scale facilities with the natural gas required for power.
Integrating the upfront capital investment required to implement the hypothetical natural gas utility on the North Slope would greatly improve the depth of scope for the economic portion of the proposal.
In the case that said individual desired to drive the direction of this research from a proposition style to more of a scientific style, in which all alternatives are presented in equal context to one another, it would then be appropriate to delve deep into the topic of alternative locations within the State of Alaska.
A good example of an alternative location for the facilities within Alaska would be Tustumena Lake, located deep within the Kenai National Wildlife Refuge on the Kenai Peninsula of Alaska. This alternative location would provide the data center facilities with direct access to an already established natural gas utility, Enstar Natural Gas. Placing the facilities at Tustumena Lake would also provide a thermal discharge resource similar to the Beaufort Sea: the lake is fed by the Tustumena Glacier and is therefore reasonably temperature consistent throughout the seasons. The greatest barriers to the success of this location would all be political and social in nature.
I’m confident there is enough here in this proposal to get a conversation going in regard to the future of the arctic and where Alaska stands in that future; my preference is that Alaska is in the front of the line leading the future of the Arctic.
8 Bibliography References
[1] U.S. Energy Information Administration. U.S. Crude Oil and Natural Gas Proved Reserves, Year-End 2017. 2018.
[2] U.S. Energy Information Administration. Electricity Data Browser. n.d.
[3] Alaska Gasline Development Corporation. Alaska's Natural Gas Supply. n.d.
[4] Google. Prudhoe Bay, Alaska (map). n.d.
[5] Z. Hagadone. How Many Homes Can You Power with a Single Megawatt? 2015.
[6] L. Talley, G. Pickard, W. Emery, and J. Swift. Chapter 3, "Physical Properties of Seawater," p. 33. Elsevier Ltd., 2011.
[7] R. Miller. Amazon Plans Epic Data Center Expansion in Northern Virginia. DataCenterFrontier.com, 2017.
[8] N.A. Data Center Raised Floor. n.d.
[9] N.A. Residential Electricity Rates & Consumption in California. ElectricityLocal.com, n.d.
[10] T. Oven. Energy Conversion. n.d.
[11] Quintillion. Quintillion Completes Installation of Historic Alaska Subsea Fiber Optic Cable System. n.d.
[12] Amazon Web Services. Why Cloud Infrastructure Matters. n.d.
[13] A. Shehabi et al. United States Data Center Energy Usage Report. Lawrence Berkeley National Laboratory, 2016.
[14] Statista. Net Revenue of Amazon from 1st Quarter 2007 to 1st Quarter 2019 (in billion U.S. dollars). 2019.
[15] Union of Concerned Scientists. Environmental Impacts of Natural Gas. n.d.
[16] I. Urieli. Specific Heat Capacities of Air. n.d.
[17] MarketWatch. Fred's Inc. n.d.
9 Appendix
List of Figures
1 CRAC Unit Function diagram illustrating Hot/Cold Aisle arrangement [8].
2 Map of Prudhoe Bay Showing Prime location for Data Center [4].
3 Generalized Layout of Data Center Facility and Support Systems on North Slope.
4 Generalized Look at Thermal Distribution Grid.
5 Placement of Discharge Pipe above Tundra on Thermopiles.
List of Tables
1 Berkeley National Lab report PUE table for data centers with respect to size [13].
2 Rough Income Savings Extrapolation cases for both length of time and quantity of data centers.