
Game-Changing Technologies for Today's Data Scene.

While change has always been part of the database credo, the growing emphasis on data-driven decision making in today's economy has brought a dizzying plethora of technologies and methodologies to market. The game-changing technologies are too numerous to cover in full, but one thing is certain: Database management will never be the same. Based on discussions with data experts from across the industry spectrum, we have identified some of the most promising technology initiatives--well-known or under the radar--that are worth watching.


"Streaming data has changed the way companies ingest information and has had an incredible impact on how businesses manage their big data opportunities," said Bobby Johnson, CTO and co-founder of Interana, and former director of engineering at Facebook. "Enterprises with the biggest competitive advantage empower both data analysts and business users to go beyond aggregate data with the ability to ask iterative questions at a granular level and in real time to fully understand their customers and their business."

Emerging or widespread? "Leveraging streaming data innovations is conceptually very established, but the adoption is emerging if you look at usage in Fortune 500 companies," said Johnson. "While the actual big pipe streaming of this data is becoming increasingly mainstream, the parts that are closer to cutting-edge involve what you do after you collect this data, such as behavioral analytics."

Potential challenges: To support streaming data, visibility and speed of insights are key, "because you must have that clear visibility in seconds--not days or weeks--to understand the diversity of your customer journey and ever-changing market conditions," Johnson said. "Without visibility, you may never see the data's richness and depth. The challenge here is that it can be more complicated, but the payoff is that it's really rich."

Future prospects: The future of streaming data is robust. "The need for streaming massive amounts of data from every angle of your business is critical for your business insights and planning," said Johnson. "In 5 years, every decision maker will have clear, reliable data that one can question in real time. Everything you care about--even something as simple as a light switch--is going to be generating information and it's going to end up in a pipeline, so it's critical to have access to interactive, real-time visibility of these things."
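Johnson's point about asking iterative, granular questions against raw events--rather than querying pre-computed aggregates--can be illustrated with a minimal sketch. This is illustrative pure-Python code, not any vendor's API; the event fields (`user`, `action`) and the 60-second window are assumptions for the example.

```python
from collections import defaultdict, deque

class StreamWindow:
    """Keep a sliding time window of raw events so granular,
    per-entity questions can be asked in near-real time."""

    def __init__(self, window_seconds=60):
        self.window = window_seconds
        self.events = deque()  # (timestamp, user, action), oldest first

    def ingest(self, ts, user, action):
        self.events.append((ts, user, action))
        self._evict(ts)

    def _evict(self, now):
        # Drop events that have aged out of the window.
        while self.events and now - self.events[0][0] > self.window:
            self.events.popleft()

    def count_by_user(self, action):
        # An "iterative question" answered from raw events,
        # not from a pre-baked aggregate.
        counts = defaultdict(int)
        for _, user, a in self.events:
            if a == action:
                counts[user] += 1
        return dict(counts)

w = StreamWindow(window_seconds=60)
w.ingest(0, "alice", "click")
w.ingest(10, "bob", "click")
w.ingest(50, "alice", "click")
print(w.count_by_user("click"))  # {'alice': 2, 'bob': 1}
```

Real streaming stacks keep such windows distributed and fault-tolerant, but the analytical idea--retaining raw events long enough to answer ad hoc, per-customer questions--is the same.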


For today's market-savvy enterprises, automated customer data platforms (CDPs) unlock the value in their customers' data, said Abhi Yadav, CEO of Zylotech. "Automated CDPs solve a variety of data challenges for marketing and sales operations staff members who can miss upsell/cross-sell opportunities simply because their data is not working to their advantage. Their benefit is in consolidating customer data from a variety of siloed sources into one platform that significantly increases the level of insights available to users. Driven by AI, automated CDPs unify the demographic, behavioral, and transactional data of customer records, creating complete views of customer data."
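As a rough sketch of the unification step Yadav describes--consolidating siloed demographic, behavioral, and transactional records into one profile per customer--consider the following. The field names and merge rules here are hypothetical, not Zylotech's; a real CDP adds identity resolution, conflict handling, and AI-driven enrichment on top.

```python
def unify_customers(records):
    """Merge records from different silos that share a customer key
    into one unified profile per customer."""
    profiles = {}
    for rec in records:
        key = rec["customer_id"]
        profile = profiles.setdefault(key, {"customer_id": key})
        for field, value in rec.items():
            if field == "customer_id":
                continue
            if field in ("events", "transactions"):
                # Behavioral/transactional history accumulates.
                profile.setdefault(field, []).extend(value)
            else:
                # Demographic fields: keep the first value seen.
                profile.setdefault(field, value)
    return profiles

crm = {"customer_id": "c1", "name": "Ada", "email": "ada@example.com"}
web = {"customer_id": "c1", "events": ["viewed_pricing"]}
pos = {"customer_id": "c1", "transactions": [{"sku": "A1", "amount": 30}]}
unified = unify_customers([crm, web, pos])["c1"]
# One record now holds demographic, behavioral, and transactional data.
```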

Widespread or emerging? While it's a newer generation tool, "CDPs are increasingly seeing widespread use, according to a recent Forbes Insights survey," said Yadav.

Potential challenges: "The primary roadblock for the adoption of a CDP remains the chaos or noise within the industry," said Yadav. "As the CDP category has gained momentum in the past few years, many vendors that touch any form of customer data have now begun positioning themselves as CDPs. This makes it difficult for enterprises to quickly understand the relative strengths and weaknesses of one platform versus another, which in turn makes it more difficult to choose a partner to form a solution around a business problem."

Future prospects: "Today, only some automated CDPs are enhanced with AutoML (automated machine learning), allowing business operations teams to have self-learning access to their data," said Yadav. "As these operations teams need a more complete view of customer data in real time to keep up with customer demands, all CDP users will require AutoML as a feature. This will not only make CDPs easy to use for non-technical users but it will give them a powerful capability to stay competitive."


Amid inefficient B2B, application, and data integrations with partners, suppliers, and customers, "ecosystem integration software is one technology area where enterprises can significantly differentiate themselves from their competitors," said Tushar Patel, CMO of Cleo. "Most companies are not keeping up because they are still approaching B2B and application integrations separately. Ecosystem integration software allows enterprises to orchestrate their end-to-end business workflows by taking an outside-in view--creating opportunities to optimize critical business processes."

Widespread or emerging? "Ecosystem integration solutions are closer to widespread use than experimental," said Patel. "Many organizations across a wide range of industries--including supply chain/logistics, manufacturing, retail, ecommerce, and software/information services, to name a few--are utilizing or embedding ecosystem integration platforms to deliver invaluable insights from their data."

Potential challenges: "Organizations must break from the past, which involved black-box integration solutions or toolkits that required significant technical depth," said Patel. "They must stop integrating for the sake of integrating and think about the desired business outcome of their integration--think through the desired end state and build from there."

Future prospects: "Ecosystem integration solutions will likely be firmly mainstream within 5 years' time because those organizations that don't adopt an ecosystem approach to integration are at a high risk of becoming obsolete," Patel said. "Due to the fact that ecosystem integration solutions truly do enable and activate entire ecosystems, we can expect these dynamic solutions to be a core part of not only any enterprise's technology stack but also their overall business strategy."


Cloud computing has been on the scene for a number of years now, but it is still seen as a disruptive technology. "Cloud-based computing platforms and services are empowering enterprises to better optimize computing and storage power to manage data," said Jai Ganesh, SVP and head of Mphasis NEXT Labs. "Cloud-based computing alleviates the stress of dealing with archaic processes and enables businesses to easily manage data, scale up or down when needed, and do away with costly data centers."

Widespread or emerging? "Cloud technology has been building momentum over the past decade but still has more room to disrupt various industries as we know them," said Ganesh. "Most enterprise workloads will be in the cloud within the next year."

Potential challenges: Of course, with any burgeoning technology, there will be areas where the technology needs to improve. "From an engineering perspective, the current models of engineering may not be appropriate or sufficient to exploit these new technologies and solve the new global problems," Ganesh said.

Future prospects: "We will witness accelerated adoption of cloud computing across industries and government," Ganesh predicted. "We are not far from the days where even quantum computing will be on the cloud on a pay-per-use basis."


As can be expected, AI and machine learning topped experts' lists of cutting-edge technologies that are reshaping the data center.

"Enterprises recognize the strategic value of deploying artificial intelligence for competitive advantage. However, studies have shown that the adoption rate of AI is only at 4%," said Dinesh Nirmal, VP of data and AI development for IBM. "The disparity is largely the result of increasing data complexity and silos--for instance, an organization that stores its data across multiple cloud environments."

Emerging or widespread? AI technology is still in emerging or experimental stages, said Nirmal. "The ability to access AI across multiple clouds is still relatively new," he added, noting that in February, IBM started offering Watson Anywhere, which cuts across cloud environments.

Potential challenges: Data science and machine learning talent remain in short supply, said Helena Schwenk, global analyst relations manager at Exasol. "The skills gap is still a significant barrier that organizations struggle to overcome. This complicates the most time-intensive and complex task--the data preparation stage of a machine-learning project. The output of a machine-learning model is only as smart as the people who train it." The other big barrier to machine learning's success is operationalization, where models don't make it into production environments to be used against live business data, Schwenk continued. "Industry research indicates that around 50% of data science projects don't make it into production." Schwenk advised "setting up a strong governance framework, building the right data platform, and developing a clear strategy to build machine learning skills."

Future prospects: AI has a bright future, Nirmal said. "We expect the trend toward running AI on the cloud to continue, due to the efficiencies, scalability, and lower costs associated with cloud. Running AI on the cloud speeds business digital transformation, helps unlock hidden insights from company-wide data, and helps them begin to automate business processes that improve efficiencies and increase performance."


"Digital threads" will help keep companies--especially manufacturers with software embedded into products--more agile. "The digital thread is the framework which connects data and produces a holistic view of an asset's data across its product lifecycle," said Mark Reisig, director of product marketing at Aras. "With access to hosts of new data from a product's performance in the field, manufacturers can determine what adjustments need to be made to future iterations of that product, as well as future new products, to stay relevant in a rapidly changing, competitive marketplace."

Emerging or widespread? At this time, "organizations are still in the midst of fully adopting a true digital thread solution," Reisig said.

Potential challenges: "Legacy technology hampers digital transformation efforts," Reisig said.

"Legacy systems consist of a mix of individual products accumulated over the years, sometimes through acquisitions. At best, they may have been loosely stitched together but more often, the result is silos within the product lifecycle management system architecture that impede collaboration," he noted.

Future prospects: In today's business climate, innovation is critical. Companies that are still focused on doing things the way they did yesterday "are in danger of being boxed out of the market by competitors and new entrants that are simply moving faster than they are," said Reisig. "If properly deployed, the digital thread can support your organization's needs next month--and next year--to stay flexible and leverage the data you have at your fingertips today to ensure you can compete tomorrow."


Virtual databases have been on the scene for a number of years, but lately have risen to a new level around test data management using virtualization. "There is amazing tech that looks at the block-storage level of the database to provide virtual databases very quickly and via self-service," said Robert Reeves, chief technology officer of Datical. "The virtual databases provided can have masked data to protect personally identifiable information. This is a game changer for large enterprises that need to develop faster yet have legacy databases. With virtual databases, developers and test teams can create ephemeral environments that mimic production systems. Thus, they can shift their system integration efforts to the left and improve delivery time and quality."
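The masking step Reeves mentions can be sketched in a few lines. A common approach is deterministic one-way hashing, so the same input always maps to the same token and joins across tables still line up after masking. This is an illustrative sketch, not Datical's implementation; the PII field list and the salt are assumptions for the example.

```python
import hashlib

PII_FIELDS = {"name", "email", "ssn"}

def mask_value(value, salt="test-env"):
    # Deterministic one-way masking: a given input always yields the
    # same token, preserving referential integrity across tables.
    digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
    return "masked_" + digest[:12]

def mask_row(row):
    """Return a copy of a row with PII fields replaced by stable tokens."""
    return {k: (mask_value(v) if k in PII_FIELDS else v)
            for k, v in row.items()}

row = {"id": 7, "email": "pat@example.com", "plan": "pro"}
masked = mask_row(row)
# "email" is replaced by a stable "masked_..." token; other fields pass through.
```

In a real test-data-management pipeline this transformation would run when the virtual copy is provisioned, so developers never see production PII.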

Emerging or widespread? "Both virtual databases and DevOps [are] very widespread in use by many enterprise-level customers," Reeves said.

Potential challenges: "DBAs have challenges with this technology as it automates a previously manual task," said Reeves. "Thus, some organizations have problems adopting due to internal foot dragging caused by DBAs scared about their job security. In practice, though, it allows DBAs to perform higher-value tasks and become more valuable to the enterprise."

Future prospects: "Currently, this technology can be applied to NoSQL databases but is not widely adopted for that use case," Reeves said. "However, with the expansion of those data stores in the enterprise, you will see virtual databases being used for both RDBMS and NoSQL."


AIOps represents the next level of maturity for ITOps, and will include capabilities such as advanced analytics, machine learning, and AI, said Douglas McDowell, chief strategy officer at SentryOne.

Emerging or widespread? "AIOps is rapidly emerging," McDowell said. "In fact, ITOps software developers--and their customers--that have not adopted AIOps at some level will be greatly limited or obsolete in the next 5 years. With the growth in the number of connected technologies, and their complexities, especially in distributed, microservice-based cloud architectures, it will be increasingly difficult for humans to efficiently and effectively monitor, diagnose, and optimize technical platforms, databases, and apps. AI-based automation and recommendations bring hope to DevOps engineers and admins, to allow for controlled development, testing, and operational ownership."

Potential challenges: "AIOps is still a bit of the 'Wild West.' Vendors are over-promising and under-delivering," said McDowell. "Customers must assess their cost of failure. Where the cost of failure is high, customers should look to AIOps for recommendations. Where the cost of failure is low, auto-tuning can be used. AIOps ISVs, to be successful, must limit their scope rather than attempting to make predictions and recommendations with an endless number of variables and without regard for customers' cost of failure."

Future prospects: "AIOps will be table stakes in less than 5 years," McDowell predicted. "All monitoring and Ops software [still in existence] will have a degree of AIOps capabilities."


Big data, along with the tools to process and analyze structured data through business intelligence applications, non-relational databases, and ETL platforms, is mainstream at this point, said Chad Steelberg, chairman and CEO of Veritone. The next area of focus is what Steelberg referred to as "big content." This, he said, represents five times more data than big data alone and includes unstructured information, composed mostly of audio and video content. "The explosion of this data, whether from medical devices or the proliferation of IP cameras, is filling the cloud at a rate of nearly 1 zettabyte per year and growing exponentially."

Emerging or widespread? Big content technology is still in its emerging phase, Steelberg pointed out, "with less than 5% of the use cases defined. Much of the cognition required for the machine to understand big content and unlock 100% of its value is still emerging and in its experimental stages."

Potential challenges: Big content increases reliance on AI, which "will be the key to our survival, not the cause of our demise," Steelberg said. "Humans are going to have to learn to trust the wisdom of the machine."

Future prospects: "Cognitive services powered by AI and new databases designed to store, retrieve, and understand this unstructured content at superhuman levels are the future," said Steelberg. "In 5 years, AI will have reached a point where machines can see, hear, and understand the world at superhuman levels. Our unstructured society will be covered in sensors, feeding these intelligent services with streams of raw data."


AutoML will have the greatest impact on enterprises' ability to process and manage data, predicted Alex Ough, CTO architect at Sungard Availability Services. "AutoML automatically selects the best algorithm and model for the intended purpose, enabling individuals with limited ML knowledge to successfully build and apply ML technologies. All you have to do is upload your labeled data, then AutoML will find the appropriate algorithm and return the best results."
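Ough's description--hand AutoML labeled data and it returns the best model--boils down to a search over candidate algorithms scored on held-out data. The toy sketch below makes that loop concrete; the two candidates (a mean predictor and ordinary least squares) are stand-ins for the real algorithm libraries an AutoML system would search, and all names here are illustrative.

```python
def fit_mean(xs, ys):
    # Baseline: always predict the training mean.
    m = sum(ys) / len(ys)
    return lambda x: m

def fit_linear(xs, ys):
    # Ordinary least squares for y = a*x + b.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var if var else 0.0
    b = my - a * mx
    return lambda x: a * x + b

def auto_select(train, valid, candidates):
    """Fit every candidate on the training split, score each on the
    validation split, and return the best -- the core loop behind
    'upload labeled data, get the best model back'."""
    best_name, best, best_err = None, None, float("inf")
    for name, fit in candidates:
        model = fit(*train)
        err = sum((model(x) - y) ** 2
                  for x, y in zip(*valid)) / len(valid[0])
        if err < best_err:
            best_name, best, best_err = name, model, err
    return best_name, best

train = ([1, 2, 3, 4], [2, 4, 6, 8])   # labeled data: y = 2x
valid = ([5, 6], [10, 12])
name, model = auto_select(train, valid,
                          [("mean", fit_mean), ("linear", fit_linear)])
# name == "linear"; model(10) == 20.0
```

Production AutoML systems add hyperparameter tuning, feature engineering, and cross-validation, but the selection principle is the same: let measured validation error, not the user, pick the algorithm.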

Emerging or widespread? "AutoML is still in the emerging stages," said Ough. "Google's AutoML is currently in beta, and another similar open source technology called Auto-Keras is in pre-release stages."

Potential challenges: "Data silos can present major challenges for AutoML technology," said Ough. "To reap the benefits of AutoML, you must have a unified source of accurate data across the enterprise and find the source of truth."

Future prospects: "I foresee this technology being used across all industries in 5 years," Ough predicted.

By Joe McKendrick
COPYRIGHT 2019 Information Today, Inc.

Article Details
Publication: Database Trends & Applications
Date: Jun 1, 2019
