Big Data

2016: the tipping point for the 3rd Platform, says IDC

IDC’s Third Platform, the next computing generation, rests on cloud computing, big data and analytics, social business, and mobility. Together, these form a foundation for scalable anywhere, anytime, any-device computing. As these trends become ubiquitous, they enable and accelerate the Internet of Things (IoT), cognitive systems, robotics, 3D printing, virtual reality, self-driving cars, and better security.

At the same time, this brave new world wreaks havoc on the old one of PCs, client-server software, and legacy apps. I would add another disruptive ingredient to the mix: open source software, which is no longer just for hobbyists and is now embedded in most new applications. IDC predicts that 2016 is the year in which spending on third platform IT will exceed that on the second platform, with a CAGR of 12.7% for 2015-2020. At the same time, it predicts that second platform investment will be down 5%.

IDC’s recent surveys show that, in terms of maturity, most companies today are in the digital exploration or early platform development phase, with 14% having no interest in digital transformation and only 8% already using digital transformation to disrupt competitors or markets. That will change by 2020, when 50% of businesses will be using this platform to disrupt and transform.

Other predictions:

  • Business owners, not IT, will control more of the IT budget.
  • Health services and financial services are two of the top investing industries, reaping the rewards of faster, cheaper, and more comprehensive uses of their data.
  • Other top applications now in the works include marketing and sales, retail, security, education, and media and entertainment.
  • Technology will be embedded in most applications and devices.
  • Start-ups are rife, and the shakeout has not yet begun.
  • Cognitive computing and AI are a requirement for developer teams: by 2018, more than 50% of developer teams will be using AI for continuous sensing and collective learning (cognitive applications and IoT).

Where does existing IT infrastructure play in this game? In our scramble as analysts to pin down trends, we often neglect the fact that existing systems and applications are still valuable. They may well be good enough for a given task or process, or they may continue to churn on, feeding into newer layers of the technology stack when appropriate. Unlike newer versions, they have had their kinks worked out. The challenge for business and IT managers will be to distinguish between the promise of the new and the security of the old: when to invest, when to explore, and when to stand back and watch. Good questions!

Click here for more information on IDC’s take on the 3rd Platform.

IBM’s Watson Expands its Toolbox: Acquires AlchemyAPI

With its acquisition last week of AlchemyAPI, IBM’s Watson Group added new tools and expertise to its already rich and growing array. AlchemyAPI’s technology complements and expands the core IBM Watson features. It collects and organizes information with little preparation, making it a quick on-ramp for building a collection of information that is sorted and searchable. It works across subject domains and doesn’t require the domain expertise that the original Watson did. Its unsupervised deep learning architecture is designed to extract order from large collections of information, including text and images, across domains.
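To make that quick on-ramp concrete, here is a minimal sketch of how a developer might call AlchemyAPI’s hosted entity-extraction service from Python. The endpoint and parameter names follow AlchemyAPI’s public REST documentation of the time; the API key and sample text are placeholders.

```python
# Minimal sketch: extracting ranked named entities with AlchemyAPI's
# REST API (per its public documentation of the time). No ontology,
# training data, or domain model is supplied -- just raw text.
import requests

API_KEY = "YOUR_ALCHEMYAPI_KEY"  # placeholder
ENDPOINT = "http://access.alchemyapi.com/calls/text/TextGetRankedNamedEntities"

response = requests.post(ENDPOINT, data={
    "apikey": API_KEY,
    "text": "IBM acquired AlchemyAPI to expand the Watson Group.",
    "outputMode": "json",
})
for entity in response.json().get("entities", []):
    # Each entity arrives with a type and a relevance score.
    print(entity["type"], entity["text"], entity["relevance"])
```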

In contrast, the original Watson tools used to understand, organize, and analyze information demand some subject expertise. For best results, experts are required to build ontologies and rules for extracting facts, relationships, and entities from text. The result is a mind-boggling capability to hypothesize, answer questions, and find relationships, but it takes time to build and is specific to a particular domain. That is both good and bad: expert-built models provide a depth of understanding, but at a significant cost in time to get up and running. The Watson tools are also text-centered, although significant strides have been made to add structured information as well as images and other forms of rich media.
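To illustrate the expert effort involved, here is a toy, hypothetical example of the kind of hand-written extraction rule a domain expert might author. The rule, function name, and sample sentence are invented for illustration; real rule sets are vastly larger and richer.

```python
import re

# Toy domain rule an expert might author: "X acquired Y" asserts an
# acquisition relationship between two named organizations.
ACQUISITION_RULE = re.compile(
    r"(?P<buyer>[A-Z]\w+) acquired (?P<target>[A-Z]\w+)"
)

def extract_acquisitions(text):
    """Apply the hand-written rule and return (buyer, target) facts."""
    return [(m.group("buyer"), m.group("target"))
            for m in ACQUISITION_RULE.finditer(text)]

print(extract_acquisitions("IBM acquired AlchemyAPI last week."))
# -> [('IBM', 'AlchemyAPI')]
```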

AlchemyAPI was designed to solve precisely these problems. It creates a graph of entities and the relationships among them, with no prior expectations for how this graph will be structured; the structure depends entirely on what information is in the collection. Again, this is both good and bad. Without subject expertise, topics that are not strongly represented in the collection may be missing or get short shrift. Both approaches have their limits as well as their advantages. Experts add a level of topic understanding, of expectations, of what might be required to round out a topic. Machines don’t. But machines often uncover relationships, causes and effects, or correlations that humans might not expect. Finding surprises is one of the strongest arguments for investing in big data and cognitive computing.
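The data-driven graph idea can be sketched in a few lines. The following uses the networkx library and a handful of invented (subject, relation, object) triples standing in for extractor output; the point is that the graph’s shape comes entirely from the data, not from a predefined schema.

```python
import networkx as nx

# Hypothetical triples emitted by an extraction pass over a collection.
triples = [
    ("IBM", "acquired", "AlchemyAPI"),
    ("AlchemyAPI", "joins", "Watson Group"),
    ("Watson Group", "division_of", "IBM"),
]

graph = nx.DiGraph()
for subj, rel, obj in triples:
    graph.add_edge(subj, obj, relation=rel)  # structure emerges from the data

# Walk whatever relationships the collection happened to contain.
for subj, obj, data in graph.edges(data=True):
    print(f"{subj} --{data['relation']}--> {obj}")
```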

In this acquisition, Watson continues the path that helped it win Jeopardy!: combining every possible tool and approach that might increase understanding. IBM can now incorporate multiple categorizers, multiple schemas, multiple sources, and multiple views, and then compare the results by the strength of their evidence. This gives us richer and more varied results, since each technology contributes something new and crucial. Like the best human analysts, the system collects evidence, sorts through it, weighs it, and comes to more nuanced conclusions.
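In the simplest possible terms, combining categorizers and comparing results by strength of evidence might look like the sketch below. The categorizer names, labels, and confidence scores are invented; production systems weigh evidence in far more sophisticated ways.

```python
from collections import defaultdict

# Hypothetical confidence-scored votes from three independent categorizers.
votes = [
    ("rule_based",    "acquisition", 0.70),
    ("deep_learning", "acquisition", 0.85),
    ("deep_learning", "partnership", 0.20),
    ("statistical",   "acquisition", 0.60),
]

evidence = defaultdict(float)
for source, label, confidence in votes:
    evidence[label] += confidence  # pool evidence across technologies

best = max(evidence, key=evidence.get)
print(best, round(evidence[best], 2))  # label with the strongest combined evidence
```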

The Watson platform adds a major, often unsung piece to information systems: it orchestrates the contributions of the technologies so that they support, balance, and inform each other. It feeds answers, errors, and user interactions back into the system so that Watson learns and evolves, as a human would. In this, it removes some of the maddening stodginess of traditional search systems, which give us the same answers no matter what we have learned. In seeking answers to complex, human problems, we need to find the right answers, perhaps some wrong answers to sharpen our understanding, and certainly the surprises that lurk within large collections. We want a system that evolves and learns, not one that rests on the laurels of a static, often outdated ontology.
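A toy sketch of that feedback loop, under loud assumptions: the source names, learning rate, and additive update rule are all invented for illustration, standing in for the far richer learning machinery a system like Watson actually uses.

```python
# Toy feedback loop: sources that back correct answers gain influence,
# so the system's conclusions can improve as users interact with it.
weights = {"rule_based": 1.0, "deep_learning": 1.0, "statistical": 1.0}
LEARNING_RATE = 0.1  # invented value

def record_feedback(supporting_sources, was_correct):
    """Nudge the weight of each source that supported the answer."""
    delta = LEARNING_RATE if was_correct else -LEARNING_RATE
    for source in supporting_sources:
        weights[source] = max(0.0, weights[source] + delta)

record_feedback(["deep_learning", "statistical"], was_correct=True)
record_feedback(["rule_based"], was_correct=False)
print(weights)  # {'rule_based': 0.9, 'deep_learning': 1.1, 'statistical': 1.1}
```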

Mirroring this technology architecture, IBM’s Watson Group requires a closely knit group of strong-minded people who are experts in their separate areas: language understanding, system architecture, voting algorithms, user interaction, probability, logic, game theory, and so on. AlchemyAPI contributes its staff of deep learning experts, who are expected to join the Watson Group. It also brings its community of 40,000 developers worldwide, who will broaden the reach and speed the adoption of cognitive computing.