We must live fighting, and fighting we must die
- Aug 10, 2020
The data centre has become more like a system, a meta-system: the infrastructure that houses data and algorithms and ensures quality of service for applications. IT has become more complex but also more efficient. Two major predictions did not come true, however. The data centre was expected to disappear into public clouds, but instead it continues to thrive in public clouds, on-premise and at the edge, where its growth will be significant. It was also expected to standardise around a small number of technology choices, but it has become more heterogeneous, more specialised and more customised. Without question, the major driver of data centres today is artificial intelligence workloads. AI applications are enabled by a virtuous cycle: it is cheap to collect massive amounts of data, there are hardware and algorithms that can make sense of that data and need massive amounts of it, and there are economically viable uses for the resulting insights and automation.
IoT is leading to edge computing, where data centres of different sizes and capabilities are necessary. There was a time when you expected a smartphone on one end, a public cloud at the other, and nothing in between. Instead, what we see emerging is a giant fabric of resources stretching from the sensor to the cloud. In this fabric, IoT is the fountain of data. The vast majority of data will be generated and ultimately consumed outside the cloud, because the information contained in every ‘thing’, whether a consumer product, a building, a city or a ship, will be harnessed so we can manage, predict, automate and control. The proverbial volume, variety and velocity of the data make it too slow and expensive to ship it all back to the cloud. This necessitates several stages between IoT and the cloud, and at every stage you must decide whether to process, store or transmit the data. That means more data centres everywhere.
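The stage-by-stage decision described above can be sketched in a few lines. This is a hypothetical illustration, not from the article: the function name, fields and thresholds are invented, but they capture the trade-off each stage faces between acting locally, keeping data, and forwarding it upstream.

```python
# Hypothetical sketch of the per-stage decision in a sensor-to-cloud fabric:
# each hop either processes a reading locally, stores it, or forwards it.
# The thresholds below are illustrative assumptions, not real system values.

def route_reading(size_mb: float, latency_budget_ms: float) -> str:
    """Decide what a fabric stage does with an incoming reading."""
    if latency_budget_ms < 50:
        return "process"   # time-critical: act at this stage (the edge)
    if size_mb > 100:
        return "store"     # too bulky to ship upstream; keep and summarise
    return "transmit"      # small and non-urgent: forward toward the cloud

readings = [
    {"size_mb": 0.5, "latency_budget_ms": 20},    # machine-safety signal
    {"size_mb": 500,  "latency_budget_ms": 5000}, # raw video archive
    {"size_mb": 1.0,  "latency_budget_ms": 2000}, # periodic telemetry
]
decisions = [route_reading(r["size_mb"], r["latency_budget_ms"]) for r in readings]
print(decisions)  # ['process', 'store', 'transmit']
```

In a real deployment the decision would also weigh bandwidth cost, privacy constraints and available compute at each stage, but the shape of the choice is the same.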
Data centres have substantially caught up, with virtualisation where possible and bare metal where necessary. The prevailing model is scalable capability with a cloud consumption model, hardware configurations that ensure quality of service, and advanced development tools that ensure developer productivity. One big lesson of AI is that you need a lot more data than you thought you did. On the face of it, that means data superiority is necessary for information superiority: whoever has the sheer volume of data is poised to win. This immediately puts data privacy, digital rights and data sovereignty at centre stage. It also ultimately determines where exactly a given data set is stored and a given workload runs.
Power costs are the next important issue. In addition to improvements in ‘results per watt’, we also see a move to renewable sources of energy and carbon-neutral facilities.