Chapter 7. Advanced Connector Examples

Now that you know from Chapter 6 what functionality connectors provide to Trino and how to configure them, let's expand that knowledge to some of the more complex usage scenarios and connectors. These are typically connectors that need to be smart enough to translate storage patterns and concepts from the underlying data source that do not map easily onto the table-oriented model of SQL and Trino.

Learn more by jumping right to the section about the system you want to connect to with Trino and query with SQL.

After working through these connectors, you can round out your understanding by learning about query federation and the related ETL usage in "Query Federation in Trino".

Connecting to HBase with Phoenix

The distributed, scalable big data store Apache HBase builds on top of HDFS. Users are not, however, restricted to accessing it through low-level HDFS and the Hive connector. The Apache Phoenix project provides a SQL layer on top of HBase, and thanks to the Trino Phoenix connector, you can therefore access HBase databases from Trino just like any other data source.

As usual, you simply need a catalog file like etc/catalog/bigtables.properties:

connector.name=phoenix
phoenix.connection-url=jdbc:phoenix:zookeeper1,zookeeper2:2181:/hbase

The connection URL is a JDBC connection string that points Phoenix at the ZooKeeper quorum used by the HBase cluster, including the ZooKeeper port and the HBase root znode.
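
Once the catalog is configured and Trino is restarted, you can explore and query the tables Phoenix exposes with standard SQL. The following is a minimal sketch; the schema and table names (default and events) are hypothetical and only illustrate the catalog.schema.table addressing with the bigtables catalog:

SHOW SCHEMAS FROM bigtables;
SHOW TABLES FROM bigtables.default;
SELECT * FROM bigtables.default.events LIMIT 10;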
