#TechFriday: Trends in technology for 2019

Atish Gude is the chief strategy officer at NetApp, a hybrid cloud data service and data management company.

Published Jan 11, 2019

As we enter 2019, what stands out is how trends in business and technology are connected by common themes. For example, artificial intelligence (AI) is at the heart of trends in development, data management and delivery of applications and services at the edge, core and cloud. Also essential are containerisation as a critical enabling technology and the increasing intelligence of Internet of Things (IoT) devices at the edge. Navigating the tempests of transformation are developers whose requirements drive the creation of new paradigms and technologies that they must then master in pursuit of long-term competitive advantage.

Atish Gude shares his main predictions for 2019:

AI will get its early start mostly in the clouds

Still at an early stage of development, AI technologies will process massive amounts of data, and most of that processing will happen in public clouds.

A rapidly growing body of AI software and service tools - mostly in the cloud - will make AI development easier and easier. This will enable AI applications to deliver high performance and scalability, both on and off premises, and to support multiple data access protocols and new data formats.

Accordingly, the infrastructure supporting AI workloads will have to be fast, resilient, and automated. While AI will certainly become the next battleground for infrastructure vendors, most new development will be aimed at the cloud.

IoT: Don’t phone home. Figure it out

Edge devices will get smarter and more capable of making processing and application decisions in real time.

Traditional IoT devices have been built around an inherent “phone home” paradigm: collect data, send it for processing, wait for instructions.

But even with the advent of 5G networks, real-time decisions can’t wait for data to make the round trip to a cloud or data centre and back - and the rate of data growth is only accelerating.

As a result, data processing will have to happen close to the consumer, and this will intensify the demand for more data processing capabilities at the edge. IoT devices and applications - with built-in services such as data analysis and data reduction - will get better, faster and smarter about deciding what data requires immediate action and what data gets sent home to the core or to the cloud.
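
To make that triage concrete, here is a minimal sketch of edge-side filtering in Python. Everything in it is an illustrative assumption rather than any vendor’s design: the rolling window, the z-score threshold and the warm-up length are invented. Routine readings are reduced to a summary, while outliers trigger immediate local action.

```python
import json
import statistics
from collections import deque

# Illustrative values only: window size, warm-up and threshold are assumptions.
ANOMALY_Z_SCORE = 3.0        # readings this far from the recent mean act now
MIN_HISTORY = 5              # readings to buffer before judging anything
window = deque(maxlen=100)   # rolling history kept on the device

def handle_reading(value: float) -> dict:
    """Decide at the edge: act immediately, or reduce and send home later."""
    if len(window) < MIN_HISTORY:
        window.append(value)
        return {"action": "buffer"}               # still warming up
    mean = statistics.fmean(window)
    stdev = statistics.stdev(window) or 1e-9      # avoid division by zero
    z = abs(value - mean) / stdev
    window.append(value)
    if z >= ANOMALY_Z_SCORE:
        return {"action": "act_locally", "value": value, "z": round(z, 1)}
    # Routine data is reduced to a running summary instead of raw readings.
    return {"action": "summarise", "mean": round(mean, 2)}

if __name__ == "__main__":
    # A steady sensor, then one anomaly that must be acted on at the edge.
    for v in (21.0, 21.2, 20.9, 21.1, 21.0, 21.3, 20.8, 21.1, 21.0, 98.6):
        print(json.dumps(handle_reading(v)))
```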

Automagically, please

The demand for highly simplified IT services will drive continued abstraction of IT resources and the commoditisation of data services.

Hardly anyone’s spending weekends changing their own oil or spark plugs any more. You turn on the car, it runs. You don’t have to think about it until you get a message saying something needs attention.

The same expectations are developing for IT infrastructure, starting with storage and data management: developers don’t want to think about it, they just want it to work. “Automagically,” please.

Especially with containerisation and “server-less” technologies, the trend towards abstraction of individual systems and services will drive IT architects to design for data and data processing, and to build hybrid, multicloud data fabrics rather than just data centres. With the application of predictive technologies and diagnostics, decision-makers will rely more and more on extremely robust yet “invisible” data services that deliver data when and where it’s needed, wherever it lives.

Building for multicloud will be a choice

Hybrid multicloud will be the default IT architecture for most larger organisations, while others will choose the simplicity and consistency of a single cloud provider.

Containers will make workloads extremely portable. But data itself can be far less portable than compute and application resources, and that affects the portability of runtime environments. Even if you solve for data gravity, data consistency, data protection and data security, you can still face platform lock-in: the cloud provider-specific services you write against are not portable across clouds.
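
One common way to blunt that lock-in - a general pattern, not anything the article prescribes - is to keep provider-specific SDK calls behind a thin, neutral interface. Here is a minimal sketch in Python: the S3 calls use the real boto3 client, while the class names, bucket and key layout are made up for illustration.

```python
import pathlib
from typing import Protocol

class ObjectStore(Protocol):
    """The thin, provider-neutral interface the application codes against."""
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...

class S3Store:
    """AWS-backed store; boto3 and its calls are real, the bucket is invented."""
    def __init__(self, bucket: str) -> None:
        import boto3  # imported here so LocalStore runs without it installed
        self._s3 = boto3.client("s3")
        self._bucket = bucket
    def put(self, key: str, data: bytes) -> None:
        self._s3.put_object(Bucket=self._bucket, Key=key, Body=data)
    def get(self, key: str) -> bytes:
        return self._s3.get_object(Bucket=self._bucket, Key=key)["Body"].read()

class LocalStore:
    """On-premises stand-in, so the same application runs outside any cloud."""
    def __init__(self, root: str) -> None:
        self._root = pathlib.Path(root)
    def put(self, key: str, data: bytes) -> None:
        path = self._root / key
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_bytes(data)
    def get(self, key: str) -> bytes:
        return (self._root / key).read_bytes()

def archive_report(store: ObjectStore, name: str, body: bytes) -> None:
    # Application logic sees only the neutral interface, never the provider SDK.
    store.put(f"reports/{name}", body)
```

Swapping S3Store for LocalStore, or for an equivalent class targeting another provider, then requires no change to the application code - which is exactly the portability the data itself can’t always offer.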

As a result, smaller organisations will either develop in-house capabilities as an alternative to cloud service providers, or they’ll choose the simplicity, optimisation and hands-off management that come from buying into a single cloud provider. And you can count on service providers to develop new differentiators to reward those who choose lock-in.

On the other hand, larger organisations will demand the flexibility, neutrality and cost-effectiveness of being able to move applications between clouds. They’ll leverage containers and data fabrics to break lock-in, to ensure total portability and to control their own destiny. Whatever path they choose, organisations of all sizes will need to develop practices to get the most out of their choice.

The container promise: really cool new stuff

Container-based cloud orchestration will enable true hybrid cloud application development.

Containers promise, among other things, freedom from vendor lock-in. While containerisation technologies like Docker will continue to have relevance, the standard for multicloud application development will be Kubernetes. But here’s the cool stuff:

New container-based cloud orchestration technologies will enable hybrid cloud application development, so new applications will be developed for public and on-premises use cases alike: no more porting applications back and forth. This will make it easier to move workloads to where data is being generated, rather than moving the data to the workloads, as has traditionally been the case.
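
As a rough illustration of that promise, here is a sketch using the official Kubernetes Python client (the library and its calls are real; the image name and kubeconfig context names are invented). The same Deployment object is applied unchanged to an on-premises cluster or a managed cloud one, simply by switching contexts.

```python
from kubernetes import client, config

def deploy(kube_context: str) -> None:
    """Apply one and the same Deployment to whichever cluster the context names."""
    config.load_kube_config(context=kube_context)
    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name="web"),
        spec=client.V1DeploymentSpec(
            replicas=2,
            selector=client.V1LabelSelector(match_labels={"app": "web"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "web"}),
                spec=client.V1PodSpec(containers=[
                    client.V1Container(
                        name="web",
                        image="registry.example.com/team/web:1.0",  # made-up image
                        ports=[client.V1ContainerPort(container_port=8080)],
                    )
                ]),
            ),
        ),
    )
    client.AppsV1Api().create_namespaced_deployment(
        namespace="default", body=deployment
    )

# The identical spec targets on-premises or cloud by switching contexts:
# deploy("onprem-cluster")
# deploy("managed-cloud-cluster")
```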

Atish Gude is the chief strategy officer at NetApp, a hybrid cloud data service and data management company.
