Here’s What 6 Smart People Say Enterprise Technology Managers Should Pay Attention to in 2016
Want a prediction that’s a lock for this, and every, year? The pace of innovation in advanced scale computing will accelerate. The spread of HPC into the commercial realm is bringing about profound technological change, much of it focused on easing implementation and providing the enhanced reliability, resilience and quality of service that businesses require. That change will continue this year. We asked leaders from six technology companies to share their prognostications on the major trends driving HPC in the enterprise for 2016.
- HPC and the Enterprise will Cross-pollinate
Intel’s Patrick Buddenbaum, Director of the Enterprise and Workstation Platform Group, envisions a year in which companies increasingly adopt not only HPC technology but also HPC techniques. “Enterprise software development methods,” he said, will move “to multi-level parallel applications developed in interpreted languages coupled with compiled languages, with stronger reliance on open-source, community development-based collaborations.”
He also sees increasing application of modeling, simulation and visualization solutions to enterprise analytic tools, such as big data analytics and interactive visualization, driving data analysis growth beyond the capabilities of today’s relatively simple data presentations, such as Excel bar charts.
Driving this will be the convergence, through software-defined infrastructure, of the HPC stack (MPI, SLURM and other schedulers) with the big data analytics stack and its equivalent services (Spark, YARN). The result, Buddenbaum said, will be “enterprises balancing the compute and software development model that is common in the HPC environment (more grid-like, more tightly coupled multi-level parallelism and designed for mass execution of tasks and assembly) with the traditional enterprise environment (more serial, spanning large, virtualized compute pools).”
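The “mass execution of tasks and assembly” model Buddenbaum contrasts with serial enterprise computing can be illustrated with a minimal Python sketch (an illustrative stand-in using the standard library, not any specific HPC scheduler or Spark API): independent tasks fan out across a worker pool, and their partial results are then assembled, producing the same answer as a serial loop.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(task_id: int) -> int:
    # Stand-in for a compute kernel (e.g., one cell of a simulation).
    return task_id * task_id

# HPC-style model: mass execution of independent tasks, then assembly.
with ThreadPoolExecutor(max_workers=4) as pool:
    partial_results = list(pool.map(simulate, range(100)))
total = sum(partial_results)  # the "assembly" step

# Traditional serial model: the same work, one task at a time.
serial_total = sum(simulate(i) for i in range(100))
assert total == serial_total
```

The point of the sketch is that the parallel and serial models are interchangeable in result but not in throughput, which is the balance Buddenbaum expects enterprises to strike.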
- Don’t Laugh: IoT Will Finally Take Off
Mark Barrenechea, CEO of OpenText, a big data EIM (enterprise information management) company, foresees 2016 as a year in which the IoT becomes a rapidly emerging reality.
“The IoT will cause massive disruption through better automation, integration, and communication,” Barrenechea said. “Insurance companies are deploying sensors and software to monitor how drivers behave and generate risk profiles using big data analytics that accurately align to or construct on-demand products to suit individual behavior. Thermostats communicate with residents and accumulate behavioral data to formulate the most energy efficient and comfortable schedules and settings. Software agents move money, stocks, goods, and people around the world, routing, optimizing, and transacting innumerable times a year—and these are just three examples already in enterprise use today. They will quickly evolve and proliferate into 2016.”
Barrenechea cited studies predicting that within a few years, 25 billion devices (according to Gartner Group) will be generating data, carrying with it an economic impact of between $4 and $11 trillion (McKinsey Global Institute). He envisions an IoT-driven future in which “perhaps we will even progress as a society to a post-scarcity economy and information itself will become our commodity of trade. Monetizing the exchange of information, micro-licensing, and transactions become prominent tasks as our automation and machine-to-machine networks take care of daily needs.”
While the permanent revolution in technological innovation delivers tremendous advantages, it’s also the case, according to a 2015 CEO survey by PricewaterhouseCoopers, that nearly 60 percent of CEOs consider digital disruption to be a challenge to their businesses. The key, Barrenechea said, is leadership.
“Over the next five years, CEOs will lead by example, adopting a ‘Digital Mindset’ driven by disruption, immediacy, and scale with centricity on journeys, experience and a real-time-ness,” he said. “Just like we have an IQ and EQ, organizations need to develop a DQ, a ‘digital quotient,’ where strategy, culture, people, and capabilities converge. The CEO will lead this charge.”
- 2016 Forecast: Cloudier
Cloud computing will be central to this change, and John Engates, CTO at cloud solutions provider Rackspace, said 2015 saw the emergence of “a multi-cloud world where value in cloud meant more than lowest cost; where companies would begin creating the exact mix of cloud infrastructure they need for their workloads.”
Engates also expects Security-as-a-Service to continue to develop apace in 2016. While he acknowledges that security worries are the top concern among CIOs considering migration to the cloud, he also observed that security is not a core competency for most companies, leaving their on-premises infrastructures vulnerable. “In addition to economies of scale, cloud companies offering security-as-a-service also offer economies of expertise,” he said. He said cloud providers like Rackspace fend off attacks every day and have learned to address security threats on both a reactive and proactive basis, citing a 2014 report by the Privacy Rights Clearinghouse showing that only 10 percent of data breaches took place in the cloud.
Engates said concerns about vendor lock-in will drive increasing acceptance by the enterprise of open source software.
“Choosing the wrong technology or provider ranks up there with security as one of the biggest worries keeping CIOs up at night,” Engates said. “And that’s why I predict open source will continue to play a critical role in cloud growth. Last year I described OpenStack, at age five, as boring, and explained why that was good. Boring means stable and that stable foundation will allow enterprises in 2016 to more fully embrace open source cloud solutions and make them part of companies’ overall cloud strategies.”
- The NoSQL Takeover
Dan Kogan, director of product marketing at data visualization software company Tableau, said the “NoSQL Takeover,” commonly associated with unstructured data, will be one of the major big data trends in 2016 as the benefits of schema-less database concepts become more pronounced. He also said Apache Spark has moved from being a component of the Hadoop ecosystem to the big data platform of choice for a growing number of enterprises because of its superior data processing speed.
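The “schema-less” idea behind NoSQL stores can be sketched with plain Python dictionaries (a conceptual illustration, not any particular database’s API): each record in a collection carries its own fields, so two records need not share a schema, and queries must tolerate missing fields rather than rely on fixed columns.

```python
# A "collection" of documents: records need not share a fixed schema.
users = [
    {"id": 1, "name": "Ada", "email": "ada@example.com"},
    {"id": 2, "name": "Grace", "twitter": "@grace", "tags": ["hpc", "ml"]},
]

# Queries handle absent fields gracefully instead of failing on a schema.
emails = [u.get("email") for u in users]          # None where the field is absent
with_tags = [u["name"] for u in users if "tags" in u]
```

This flexibility is what lets schema-less stores absorb varied, evolving data, at the cost of pushing schema discipline from the database into the application.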
“We see more and more compelling enterprise use cases around Spark,” Kogan said, “such as at Goldman Sachs where Spark has become the ‘lingua franca’ of big data analytics.”
By the same token, Kogan said, Hadoop projects continue to mature, moving from proof-of-concept to production stage. He cited a recent survey of 2,200 Hadoop customers in which less than 5 percent anticipate they will reduce their use of Hadoop in the next 12 months; three-quarters of Hadoop users plan to do more within the next three months and almost half of companies that haven’t deployed Hadoop say they will within the next 12 months.
As Hadoop increasingly becomes a core aspect of the enterprise IT landscape, Kogan expects growing investment in Hadoop support components, such as security.
“The Apache Sentry project provides a system for enforcing fine-grained, role-based authorization to data and metadata stored on a Hadoop cluster,” he said. “These are the types of capabilities that customers expect from their enterprise-grade RDBMS platforms and are now coming to the forefront of the emerging big data technologies, thus eliminating one more barrier to enterprise adoption.”
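The fine-grained, role-based authorization Kogan describes can be reduced to a small conceptual sketch in Python (illustrative only; Sentry itself is configured through policies on Hive, Impala and HDFS objects, and the roles, users and objects below are invented): users hold roles, roles hold privileges on specific objects and actions, and every access is checked against that mapping.

```python
# Hypothetical roles mapped to (object, action) privileges.
ROLE_PRIVILEGES = {
    "analyst": {("sales_db.orders", "SELECT")},
    "etl": {("sales_db.orders", "SELECT"), ("sales_db.orders", "INSERT")},
}
# Hypothetical user-to-role assignments.
USER_ROLES = {"alice": ["analyst"], "bob": ["etl"]}

def is_authorized(user: str, obj: str, action: str) -> bool:
    # A user is authorized if any of their roles grants the privilege.
    return any(
        (obj, action) in ROLE_PRIVILEGES.get(role, set())
        for role in USER_ROLES.get(user, [])
    )

assert is_authorized("alice", "sales_db.orders", "SELECT")
assert not is_authorized("alice", "sales_db.orders", "INSERT")
```

The value of the role indirection is administrative: privileges are granted once per role, not per user, which is what makes authorization manageable at enterprise scale.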
Accompanying this will be growing demand for enhanced Hadoop data exploration capabilities from such providers as Cloudera Impala, AtScale, Actian Vector and Jethro Data “that enable the business user’s old friend, the OLAP cube, for Hadoop – further blurring the lines between the ‘traditional’ BI concepts and the world of ‘Big Data’.” Kogan also said self-service data preparation tools will continue to explode in popularity.
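The OLAP cube Kogan calls “the business user’s old friend” is, at heart, pre-aggregation of a measure across every combination of dimensions. A minimal standard-library sketch (illustrative only, with made-up sales rows, not any vendor’s engine) shows the idea:

```python
from collections import defaultdict
from itertools import combinations

rows = [
    {"region": "EMEA", "product": "A", "sales": 100},
    {"region": "EMEA", "product": "B", "sales": 50},
    {"region": "APAC", "product": "A", "sales": 70},
]
dims = ("region", "product")

# Aggregate the measure over every subset of dimensions (the cube's roll-ups).
cube = defaultdict(int)
for row in rows:
    for r in range(len(dims) + 1):
        for subset in combinations(dims, r):
            key = tuple((d, row[d]) for d in subset)
            cube[key] += row["sales"]

assert cube[()] == 220                                   # grand total
assert cube[(("region", "EMEA"),)] == 150                # roll-up by region
assert cube[(("region", "EMEA"), ("product", "A"))] == 100
```

Because every roll-up is computed ahead of time, interactive queries become dictionary lookups rather than scans, which is the performance trick BI-on-Hadoop tools bring to big data.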
Another trend Kogan foresees gaining in 2016 is a major shift in on-demand data warehousing technology to the cloud, where Amazon led the way with Redshift and now faces competition from Google (BigQuery), Microsoft (Azure SQL Data Warehouse) and Teradata, along with start-ups such as Snowflake. Kogan cited studies showing that most companies using Hadoop will also keep their data warehouses.
“With these new cloud offerings, those customers can dynamically scale up or down the amount of storage and compute resources in the data warehouse relative to the larger amounts of information stored in their Hadoop data lake,” he said, adding that with the IoT driving a petabyte-scale data explosion, the leading cloud and data companies will introduce IoT services that enable data to move seamlessly to their cloud-based analytics engines.
“Though these changes and trends may seem disparate,” Kogan said, “they’re all linked by the need to work with data quickly and conveniently. As big data changes and new ways of working with that data pop up, the details shift, but the song remains the same: everyone’s a data analyst, and there’s never been a more exciting job.”
- Data (In)Security
On the data security front, Rohit Gupta, CEO of Palerra, said IoT-driven data will lead to new varieties of breach threats, such as attacks on the computers in cars that could cause massive road incidents, on protected health information (PHI) from systems monitoring patient medical devices, and on information about homeowners’ electrical and water usage.
To combat this, Gupta predicts that collaboration between threat detection vendors will grow beyond sharing of virus signatures to include IoCs (indicators of compromise).
“As IoCs continue to gain importance,” Gupta said, “the vendors will start to collaborate and pool their resources together to offer the quickest and most accurate way to detect compromises.”
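At the detection end, pooling IoCs amounts to matching shared indicators (IP addresses, file hashes, domains) against local telemetry. A toy Python sketch makes the mechanics concrete (the feed structure and all values below are invented for illustration):

```python
# Hypothetical pooled IoC feed shared between vendors (all values invented;
# the IPs are from documentation-reserved ranges).
shared_iocs = {
    "ips": {"203.0.113.9", "198.51.100.7"},
    "hashes": {"d41d8cd98f00b204e9800998ecf8427e"},
}

# Local telemetry observed on a network.
observed_ips = {"10.0.0.5", "203.0.113.9"}
observed_hashes = {"d41d8cd98f00b204e9800998ecf8427e"}

# Detection is the intersection of local observations with pooled indicators.
ip_hits = observed_ips & shared_iocs["ips"]
hash_hits = observed_hashes & shared_iocs["hashes"]
```

The wider the pooled feed, the larger the set being intersected against, which is why Gupta expects vendors that share indicators to detect compromises faster than any of them could alone.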
- A Break-out Year for Containers
Finally, we have predictions related to the rapidly evolving world of containers, which Luke Marsden, CTO of container data management vendor ClusterHQ, asserts will drive the next wave of public cloud adoption – even by large, highly regulated industries.
“While it might not seem like containers (the supposedly less-secure virtualization alternative) will push the most conservative enterprises towards public cloud,” Marsden said, “that is exactly what is happening.”
While Netflix was the AWS customer that most typified the first wave of cloud adoption (high scale, high complexity), he said, Capital One, one of the U.S.’s largest banks, typifies the second cloud revolution. “The benefits in speed and agility for containers are so great that large, established and heavily regulated industries will turn to containers in droves as a way to innovate in the face of stiffer competition.”
Key to enabling this will be the adoption of standards.
“A range of standards bodies have emerged recently to deal with the issues around interoperability in container environments, from the Open Container Initiative to the Cloud Native Computing Foundation,” Marsden said. “This trend will continue with standards emerging at all levels of the container stack, from runtime and format, to security, to networking and storage. As a community driven by technical merit — not marketing budgets — container projects that get real developer traction will be quickly adopted as the standard in the fast moving container ecosystem.”
Having said that, Marsden also believes companies won’t begin making large investments in container technology until 2017.
“This coming year will be a time when Docker progresses from primarily the domain of DevOps and becomes more prevalent in a greater variety of enterprise use cases. However, even with the front office starting to pay closer attention to the benefits of containers from a business perspective, it will be at least another year before million dollar contracts with container companies become a regular occurrence.”