Pivotal executive and proud tech geek Chad Sakac gave the keynote at the Dell Technologies Forum in Toronto on Tuesday, deconstructing the present process of technological change and counselling customers on a strategy for addressing scary change that transcends the Dell companies: embracing change within their own companies and finding ways to encourage it from the bottom up.
TORONTO — It’s become something of a cliché at technology events to hear the refrain that today is the day in your life that will see the least amount of change, because the pace of everything will continue to accelerate. Dell Technologies Forum, the event here for the company’s Canadian customers, was certainly no exception. However, in the event keynote, Chad Sakac, Pivotal’s SVP of Strategic Alliances, laid out a model of what Dell sees as the defining technological features of this turbulent era, and explained how customers need to work within it to respond in a positive manner that will make their companies winners rather than victims.
“It’s time for fearless, optimistic change,” Sakac said. “What we are trying to do is jar everyone a bit, with ideas that are bigger and broader than Dell Technologies as a company. This is the Dell Technologies corporate view, echoed through my own lens and experiences. This is a scary new era because our industry is changing more quickly than ever, and because we aren’t just augmenting our muscles, but our minds – the things that define us.”
Sakac described the mindsets he typically encounters when he talks with customers.
“Customers are asking us for two things that are mutually opposite,” he said. “On the one end, they want increased options and choices, and on the other end, they also want outcomes – the full stack.”
They also often don’t grasp the full potential and implications of technology. He said that Microsoft paid a lot of money for GitHub, which hosts open source projects, because CEO Satya Nadella is a very smart guy.
“They think about GitHub not for what it is, but for what it represents,” he said. “It lets you see what projects are getting pulls and which are not. When I meet customers, they don’t realize this.”
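For readers who want a concrete sense of the signal Sakac is describing, repository activity on GitHub can be read programmatically. The sketch below is only illustrative (the owner and repository names are hypothetical) and uses GitHub’s public REST API to count the most recent page of pull requests for a project as a rough proxy for how much “pull” it is getting.

```python
import requests

# Hypothetical repository names, used purely for illustration.
OWNER, REPO = "example-org", "example-project"

# GitHub's REST API lists pull requests for a repository.
# Unauthenticated requests are rate-limited, so treat this as a sketch.
url = f"https://api.github.com/repos/{OWNER}/{REPO}/pulls"
resp = requests.get(url, params={"state": "all", "per_page": 100})
resp.raise_for_status()

pulls = resp.json()
print(f"{OWNER}/{REPO}: {len(pulls)} pull requests on the most recent page")
```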
Sakac also emphasized two elements needed to deal with this change and properly leverage technology – human-machine partnerships, and change that originates from within an organization rather than from the top.
“When you think about how to unlock the value of your data, you must realize that data sets now are impossible for a human to reason over,” he said. “They are too large. It makes human-machine partnerships incumbent.”
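To make the human-machine partnership idea concrete, the minimal sketch below (using scikit-learn on synthetic data that stands in for a real data set) has the machine do the exhaustive pass over a million records and surface a short list of outliers for a human to examine.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic stand-in for a data set far too large for a person to review row by row.
rng = np.random.default_rng(42)
records = rng.normal(loc=0.0, scale=1.0, size=(1_000_000, 5))

# The machine does the exhaustive scan, flagging roughly 0.1% of rows as anomalous.
model = IsolationForest(contamination=0.001, random_state=42)
flags = model.fit_predict(records)  # -1 marks an outlier

suspects = np.where(flags == -1)[0]
print(f"Human reviewers look at {len(suspects)} rows instead of {len(records):,}.")
```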
The second idea is that the speed of iteration is the most important parameter for getting to good outcomes, and that this kind of change is much more likely to come from below.
“I’ve rarely seen anything good come from a top-down corporate initiative,” Sakac told the keynote audience. “It’s actually the small initiatives that typically turn into something big, not the mega-projects that require massive marshalling of resources. Now, you will not have success without C-level support. But if anyone is waiting for a corporate initiative to be the agent of change – that’s not how it works. Change originates with individuals.”
In understanding the change that is taking place, and enabling strategies to deal with it, Sakac said customers need to understand the interrelationship of five core principles: the Internet of Things; artificial intelligence; augmented reality/virtual reality (AR/VR); hybrid and multi-cloud models; and software-defined everything.
Sakac said while the Internet of Things has become a cliché which the media has oversimplified, it is of immense importance.
“The IoT is not a thing, it’s a concept, and it’s very real,” he said. “We tend to identify it with industrial use cases, but it’s in everything that we do. Kroger – the American Loblaws – has devices that smell goods and track detailed information on the health of those goods. It seems gimmicky, but it’s valuable information for the client. It provides business value. So how do we architect for continuously better outcomes and also de-risk them? We see IoT as about architecting a system that embraces the reality that IoT is at the Edge, and at the distributed core, and in the cloud. It’s not just one. Even with Moore’s Law slowing down, a device that costs under a dollar will have more than a trillion transistor connections, and we touch this at every single level. De-risking the journey involves curation and special use blueprints with partners.”
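As a purely conceptual sketch of the edge, distributed core, and cloud split Sakac describes (every name and threshold below is hypothetical), an architecture like this pushes raw sensing to cheap edge devices, reduction to nearby core infrastructure, and longer-term analysis to the cloud.

```python
import random
import statistics

# Hypothetical sketch of the edge / distributed core / cloud layering.

def edge_sense(device_id: str) -> dict:
    """Edge: a cheap sensor produces a raw reading, e.g. a gas value that 'smells' produce."""
    return {"device_id": device_id, "volatile_ppm": random.uniform(0.0, 10.0)}

def core_aggregate(readings: list[dict]) -> dict:
    """Distributed core: nearby infrastructure reduces raw readings to a compact summary."""
    values = [r["volatile_ppm"] for r in readings]
    return {"count": len(values), "mean_ppm": statistics.mean(values), "max_ppm": max(values)}

def cloud_analyze(summary: dict) -> str:
    """Cloud: longer-term analytics turn the summary into a business signal."""
    return "flag stock for inspection" if summary["max_ppm"] > 8.0 else "stock is healthy"

readings = [edge_sense(f"edge-{i:03d}") for i in range(100)]
summary = core_aggregate(readings)
print(summary, "->", cloud_analyze(summary))
```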
Sakac said the nature and impact of artificial intelligence is similar.
“AI is also all around us, and we need to have a posture to embrace this idea,” he said. “There is a tendency in the media to focus on the risk to careers from AI, and whether AI will disrupt jobs. Of course it will! And it will create many more new jobs. The AI engine is innovation within compute, with the result that general-purpose CPUs are used less and less. We are seeing more GPUs, and more mathematical models running on TensorFlow, and we are investing in silicon CPU and GPU startups. We think Dell Technologies has the ability to deal with this because we have a proven ability to disrupt ourselves, with disruptions such as those around software-defined storage and vSAN.”
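As a small illustration of the shift from general-purpose CPUs to accelerators, a framework like TensorFlow will report whatever GPUs are visible and place work on them automatically when they are present; the sketch below simply checks for GPUs and runs a single matrix multiply on the default device.

```python
import tensorflow as tf

# TensorFlow enumerates available accelerators; on a GPU-equipped host this list is non-empty.
gpus = tf.config.list_physical_devices("GPU")
print(f"GPUs visible to TensorFlow: {len(gpus)}")

# One matrix multiply; TensorFlow places it on a GPU automatically when one is available.
a = tf.random.normal((1024, 1024))
b = tf.random.normal((1024, 1024))
c = tf.matmul(a, b)
print("Result computed on:", c.device)
```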
Sakac said that the third area, AR/VR, is still in its infancy.
“This is immersive and collaborative computing,” he said. “We are in an era of machine and human partnership, and human beings interact with the world through our senses. So changing the sensory experience of how people experience the world will be an essential part of human-machine partnerships. They augment the human experience, in the same way that AI and machine learning augment human capability.”
The final two concepts – hybrid and multi-cloud models and software-defined everything – underpin the other three.
“They are essential,” Sakac said. “The compute and the data need to live somewhere, and where it lives is not as important as what it does. Increasingly, the common pattern for handling these is the horizontal control planes that define commonality – from the VM level, to the container, to developer platforms, and even higher in the stack to developer frameworks like .NET. Those control planes that span on-prem and off-prem infrastructure represent the architectural model for how the workloads actually run. The hybrid cloud is an operating model for this. The world is not all going to the public cloud. That’s not a pragmatic reality. The majority of customers plan to use five or more cloud models. How do you architect for that? With common control planes that give you control and give you choice. The VMware vRealize managed cloud layer brings these together. Similarly, with what we are doing with Pivotal Cloud Foundry, customers can push code to an on-prem or off-prem stack, with any cloud you want and any application you want.”
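One way to picture the common control plane idea is a single deployment interface that can target more than one landing zone. The sketch below is hypothetical and is not a vRealize or Cloud Foundry API; it only shows the same deploy call being routed to an on-prem or a public cloud backend.

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of a control plane spanning on-prem and off-prem infrastructure.
# These classes are illustrative only, not vRealize or Cloud Foundry APIs.

class CloudBackend(ABC):
    @abstractmethod
    def deploy(self, app: str, artifact: str) -> str: ...

class OnPremBackend(CloudBackend):
    def deploy(self, app: str, artifact: str) -> str:
        return f"{app}: {artifact} deployed to the on-prem cluster"

class PublicCloudBackend(CloudBackend):
    def __init__(self, provider: str):
        self.provider = provider

    def deploy(self, app: str, artifact: str) -> str:
        return f"{app}: {artifact} deployed to {self.provider}"

class ControlPlane:
    """The common layer: one deploy call, many possible landing zones."""
    def __init__(self, backends: dict[str, CloudBackend]):
        self.backends = backends

    def deploy(self, app: str, artifact: str, target: str) -> str:
        return self.backends[target].deploy(app, artifact)

plane = ControlPlane({
    "on-prem": OnPremBackend(),
    "aws": PublicCloudBackend("AWS"),
    "azure": PublicCloudBackend("Azure"),
})
print(plane.deploy("orders-service", "orders-1.4.2.jar", "on-prem"))
print(plane.deploy("orders-service", "orders-1.4.2.jar", "azure"))
```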
Finally, the whole infrastructure model has to be programmable.
“That’s what software-defined means – being completely programmable regardless of where it runs,” Sakac said. “You cannot deploy Kubernetes without a software-defined network. Every Kubernetes cluster has to have software-defined ingress and egress routing. A Kubernetes cluster by itself can’t actually talk to the outside world, so it has limited use. You also need software-defined load balancers. If you want this to work, you need an operating model where everything is very programmable, and developers and consumers can do it on demand, whether on-prem or in the cloud. Dell Technologies is not perfect, but we have a lot of experience at constructing a hybrid cloud in the software-defined model.”
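To show what software-defined ingress and load balancing look like in practice, the sketch below uses the Kubernetes Python client to declare a LoadBalancer Service for a hypothetical “storefront” workload; without an object like this, the pods in a cluster cannot be reached from the outside world. Running it requires a reachable cluster and a valid kubeconfig, so it is a sketch rather than a turnkey script.

```python
from kubernetes import client, config

# Software-defined exposure for a hypothetical "storefront" deployment:
# a Service of type LoadBalancer routes outside traffic to matching pods.
service = client.V1Service(
    metadata=client.V1ObjectMeta(name="storefront"),
    spec=client.V1ServiceSpec(
        type="LoadBalancer",                      # realized by a software-defined load balancer
        selector={"app": "storefront"},           # selects the pods that receive the traffic
        ports=[client.V1ServicePort(port=80, target_port=8080)],
    ),
)

if __name__ == "__main__":
    config.load_kube_config()  # needs a valid kubeconfig pointing at a cluster
    client.CoreV1Api().create_namespaced_service(namespace="default", body=service)
```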