What’s new, what’s coming, and what’s missing: a conference recap from Tableau Conference 2017.

Tableau Software is the Apple of the analytics market, with a huge fan base and enthusiastic customers who are willing to stand in long lines for a glimpse at what’s next. Last week’s Tableau Conference in Las Vegas proved that once again with record attendance of more than 14,000.

The Tableau fanboys and fangirls were not disappointed, as the company detailed plenty of new capabilities. The highly anticipated Hyper engine, for example, is now in beta as part of release 10.5 and is expected to be generally available by early next year. Hyper addresses Tableau’s performance problems with large data extracts. The columnar, in-memory technology speeds the creation of data extracts, makes it possible to work with larger extracts, and better supports scalability for enterprise-scale deployments.

Also in 10.5 is a slew of upgrades, including nested projects for more granular administrative control, mapping and web-authoring improvements, and a “Viz in Tooltip” feature that displays sparkline-style visualizations when you hover over a data point. A new Extensions API will enable developers to bring third-party application functionality into Tableau. For example, natural-language interpretations from Automated Insights can be embedded into Tableau dashboards to help explain the data visualizations, and users looking for data sources to explore could see suggestions from Alation, the third-party data catalog.


For now, Tableau’s new Hyper engine meets scale and performance demands tied to handling structured data extracts. In the future, it will address NoSQL and graph workloads.

A bit farther over the horizon, Tableau offered a preview of its Project Maestro self-service data-prep option. Tableau executives said there would still be a place for the deeper self-service data-prep functionality offered by partners, but it looks like Maestro will deliver intuitive, visual tools that will enable many business users to combine, clean, and transform data. (Thus, data-prep partners like Alteryx and Trifacta are moving to provide more advanced capabilities, such as prediction and machine learning.)

Maestro is expected to be in beta release by year end. General availability typically follows beta release within a quarter, but Tableau execs weren’t ready to discuss packaging or pricing of what will be an optional module that’s integrated with, but separate from, Tableau Desktop and Tableau Server.

Even farther over the horizon, Tableau outlined plans for more “smart” capabilities powered by machine learning and natural language query. Tableau already recommends data sources based on historical behavior by user, group, role, and access privileges, but more discovery and analysis recommendations are in the works. Having recently acquired ClearGraph, Tableau is also working on natural language query capabilities that will enable users to have more of a dialogue with the software. Using a technique called query pragmatics, ClearGraph’s technology can retain the context of a previous query to drill down to deeper insight. The queries can be typed in or, with third-party voice-to-text capabilities, spoken into mobile devices.
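The idea behind query pragmatics is that a follow-up question inherits whatever the previous question established. As a rough illustration (this is a hypothetical toy sketch, not ClearGraph’s or Tableau’s actual implementation), a conversational query session might merge each new question into the accumulated context:

```python
# Toy sketch of context-carrying queries ("query pragmatics" in spirit).
# This is illustrative only; names and structure are invented, not
# ClearGraph's real API.

class ConversationalQuery:
    """Resolves follow-up questions by merging them with prior context."""

    def __init__(self):
        self.context = {}  # measure and filters carried over so far

    def ask(self, measure=None, **filters):
        # A follow-up that omits the measure or earlier filters
        # inherits them from the previous question.
        if measure is not None:
            self.context["measure"] = measure
        self.context.update(filters)
        return dict(self.context)

session = ConversationalQuery()
q1 = session.ask(measure="sales", region="West")  # "Show sales in the West"
q2 = session.ask(year=2017)                       # "What about 2017?"
print(q2)  # the follow-up keeps the measure and region, adds the year
```

Here the second question, “What about 2017?”, would be meaningless on its own; it only resolves to a drill-down because the measure and region persist from the first query.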

By Doug Henschen

Source: ZDNet