
Environment Check

Kylin provides an environment dependency checking tool to help you spot potential problems before you start it. The startup script runs this tool automatically the first time you start Kylin.

How To Use

As mentioned above, the startup script runs this tool automatically the first time you start Kylin. If the check fails, the tool runs again the next time you start Kylin. Once the check passes, the tool is no longer executed automatically.

If you need to check the environment dependencies manually, run the following command:

$KYLIN_HOME/bin/check-env.sh
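
If you want to script the check, for example in a deployment pipeline, a minimal sketch is below; it assumes check-env.sh signals failure through a non-zero exit code, which is worth verifying for your version.

# Run the environment check and abort on failure
# (assumes check-env.sh exits non-zero when a check fails)
if ! "$KYLIN_HOME/bin/check-env.sh"; then
    echo "Environment check failed; fix the reported problems before starting Kylin." >&2
    exit 1
fi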

What To Check

The following list describes what the tool checks.

Kerberos
Checks whether Kerberos is enabled in the settings. If not, this check is skipped. Otherwise, it performs the following operations:
1. check whether the Kerberos command exists
2. initialize Kerberos
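
For a manual spot check, something like the sketch below should work; the keytab path and principal are hypothetical placeholders for your own values.

# Verify the Kerberos client tools are installed
command -v kinit >/dev/null || echo "kinit not found in PATH"

# Obtain a ticket with a placeholder keytab and principal, then list it
kinit -kt /path/to/kylin.keytab kylin@EXAMPLE.COM && klist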

OS version and command
Kylin only supports Linux operating systems. Besides the operating system, this tool also checks whether the hadoop and yarn commands exist. If these two commands are not available, make sure the Hadoop cluster is available.
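
The same prerequisites can be verified by hand:

# Kylin supports Linux only
uname -s    # expected output: Linux

# The hadoop and yarn client commands must be on the PATH
command -v hadoop >/dev/null || echo "hadoop command not found"
command -v yarn >/dev/null || echo "yarn command not found"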

Hadoop configuration files
Kylin copies Hadoop configuration files, such as core-site.xml, hdfs-site.xml, yarn-site.xml, and hive-site.xml, to the Kylin installation directory $KYLIN_HOME/hadoop_conf. This tool checks that $KYLIN_HOME/hadoop_conf exists and contains the necessary configuration files.
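
A quick way to confirm the expected files are in place:

# Check for the commonly required configuration files
for f in core-site.xml hdfs-site.xml yarn-site.xml hive-site.xml; do
  [ -f "$KYLIN_HOME/hadoop_conf/$f" ] || echo "missing: $f"
done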

HDFS working directory
1. Checks whether the HDFS working directory exists
2. If it does, checks whether the current user has write privileges
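
You can reproduce both steps by hand with the HDFS shell; the path below is a placeholder for your configured working directory. The same approach also covers the Spark log directory check further down.

# Placeholder path; substitute your configured HDFS working directory
WORKING_DIR=/kylin

# 1. Does the directory exist?
hdfs dfs -test -d "$WORKING_DIR" || echo "working directory does not exist"

# 2. Can the current user write to it?
hdfs dfs -touchz "$WORKING_DIR/.permission_test" && hdfs dfs -rm "$WORKING_DIR/.permission_test"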

Java version
Currently, only Java 1.8 and later versions are supported
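
To see which Java version is active on the machine:

# Print the Java version; it should report 1.8 or later
java -version 2>&1 | head -n 1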

Server port
Checks whether the configured server port is already in use
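
A quick way to see whether a port is already taken (7070 below is only an example; substitute the port Kylin is configured to use):

# Print any listener already bound to the example port
ss -tln | grep -E ':7070\b' && echo "port 7070 is already in use"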

Spark
1. Checks whether the configured resource sizes, such as executor cores and executor instances, exceed the cluster's actual resources
2. Checks whether Spark is available
3. Checks whether the YARN queues configured for submitting query jobs and build jobs are valid
4. Checks whether the configured driver host address is valid
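
Points 2 and 3 are easy to approximate by hand; the sketch assumes the Kylin distribution ships Spark under $KYLIN_HOME/spark.

# Confirm the bundled Spark can start (location is an assumption)
"$KYLIN_HOME/spark/bin/spark-submit" --version

# List YARN queues so the configured queue names can be verified by eye
mapred queue -list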

Spark log directory
Users can configure an HDFS directory to store Spark logs, so this check verifies that the directory exists and that the current user has read and write privileges.

Metastore
Checks whether the metastore is accessible and the current user can perform the necessary operations on metadata.
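
A minimal connectivity smoke test, assuming a MySQL-backed metastore (the host, user, and database below are placeholders):

# Placeholder connection values; a successful query means the metastore is reachable
mysql -h metastore-host -u kylin -p -D kylin -e "SELECT 1;"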

InfluxDB
1. Checks whether InfluxDB is accessible
2. Checks whether the current user has read and write privileges
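
InfluxDB 1.x exposes a /ping endpoint, so a reachability check can be a single request (host and port are placeholders):

# A healthy InfluxDB 1.x instance answers /ping with HTTP 204
curl -s -o /dev/null -w "%{http_code}\n" http://influxdb-host:8086/ping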

ZooKeeper
Checks whether service discovery is available.
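
A basic liveness probe using ZooKeeper's four-letter commands (host and port are placeholders; on ZooKeeper 3.5+ the ruok command may need to be whitelisted via 4lw.commands.whitelist):

# A healthy ZooKeeper server replies "imok"
echo ruok | nc zookeeper-host 2181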

KylinConfig
Checks the Kylin configuration: every configuration key must start with kylin, spring, or server.
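
A rough manual scan for keys with an unexpected prefix, assuming the configuration lives in $KYLIN_HOME/conf/kylin.properties:

# Flag any configuration keys that do not use an accepted prefix
grep -Ev '^[[:space:]]*(#|$)' "$KYLIN_HOME/conf/kylin.properties" \
  | grep -Ev '^(kylin|spring|server)\.' || echo "all keys use a valid prefix"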

Query history
Checks whether the current user has read and write permissions on the query_history and query_history_realization tables in the RDBMS.
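
A read-side spot check, again assuming a MySQL-backed metastore; the table names are taken from the check above, and in practice they may carry a configured prefix:

# Placeholder connection values; each query confirms read access to one table
mysql -h metastore-host -u kylin -p -D kylin \
  -e "SELECT 1 FROM query_history LIMIT 1; SELECT 1 FROM query_history_realization LIMIT 1;"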