Version: Next(1.7.0)

Configurations

A public configuration file, linkis.properties, is provided in the conf directory so that common parameters do not have to be configured in each microservice separately. This document lists the parameters by module.

Please note: this article only covers the Linkis configuration parameters that affect running performance or depend on the environment. Many parameters that users do not need to care about have been omitted; if you are interested, you can browse the source code to view them.

1. General Configuration

The general configuration can be set in the global linkis.properties; a setting here takes effect for every microservice.

1.1 Global Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.encoding | utf-8 | Linkis default encoding format |
| wds.linkis.date.pattern | yyyy-MM-dd'T'HH:mm:ssZ | Default date format |
| wds.linkis.test.mode | false | Whether to enable debug mode; if set to true, all microservices support password-free login and all EngineConns open remote debugging ports |
| wds.linkis.test.user | None | The default login user for password-free login when wds.linkis.test.mode=true |
| wds.linkis.home | /appcom/Install/LinkisInstall | Linkis installation directory; if it does not exist, the value of LINKIS_HOME is used instead |
| wds.linkis.httpclient.default.connect.timeOut | 50000 | Default connection timeout of the Linkis HttpClient |

1.2 LDAP Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.ldap.proxy.url | None | LDAP URL address |
| wds.linkis.ldap.proxy.baseDN | None | LDAP baseDN address |
| wds.linkis.ldap.proxy.userNameFormat | None | |

1.3 Hadoop Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.hadoop.root.user | hadoop | HDFS superuser |
| wds.linkis.filesystem.hdfs.root.path | None | User's default HDFS root path |
| wds.linkis.keytab.enable | false | Whether to enable Kerberos |
| wds.linkis.keytab.file | /appcom/keytab | Kerberos keytab path, only effective when wds.linkis.keytab.enable=true |
| wds.linkis.keytab.host.enabled | false | |
| wds.linkis.keytab.host | 127.0.0.1 | |
| hadoop.config.dir | None | If not configured, it is read from the environment variable HADOOP_CONF_DIR |
| wds.linkis.hadoop.external.conf.dir.prefix | /appcom/config/external-conf/hadoop | Extra Hadoop configuration directory |
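For example, to enable Kerberos authentication, the keytab switch and path could be set as follows in linkis.properties (the path shown is the documented default; adjust it to your environment):

```properties
# Enable Kerberos and point Linkis at the keytab directory
wds.linkis.keytab.enable=true
wds.linkis.keytab.file=/appcom/keytab
```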

1.4 Linkis RPC Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.rpc.broadcast.thread.num | 10 | Number of Linkis RPC broadcast threads (the default value is recommended) |
| wds.linkis.ms.rpc.sync.timeout | 60000 | Default processing timeout of the Linkis RPC Receiver |
| wds.linkis.rpc.eureka.client.refresh.interval | 1s | Refresh interval of the Eureka client's microservice list (the default value is recommended) |
| wds.linkis.rpc.eureka.client.refresh.wait.time.max | 1m | Maximum waiting time for a refresh (the default value is recommended) |
| wds.linkis.rpc.receiver.asyn.consumer.thread.max | 10 | Maximum number of Receiver Consumer threads (if there are many online users, increasing this parameter is recommended) |
| wds.linkis.rpc.receiver.asyn.consumer.freeTime.max | 2m | Maximum idle time of a Receiver Consumer |
| wds.linkis.rpc.receiver.asyn.queue.size.max | 1000 | Maximum buffer size of the Receiver consumption queue (if there are many online users, increasing this parameter is recommended) |
| wds.linkis.rpc.sender.asyn.consumer.thread.max | 5 | Maximum number of Sender Consumer threads |
| wds.linkis.rpc.sender.asyn.consumer.freeTime.max | 2m | Maximum idle time of a Sender Consumer |
| wds.linkis.rpc.sender.asyn.queue.size.max | 300 | Maximum buffer size of the Sender consumption queue |

2. Computation Governance Configuration

2.1 Entrance Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.spark.engine.version | 2.4.3 | Default Spark version used when the user submits a script without specifying a version |
| wds.linkis.hive.engine.version | 1.2.1 | Default Hive version used when the user submits a script without specifying a version |
| wds.linkis.python.engine.version | python2 | Default Python version used when the user submits a script without specifying a version |
| wds.linkis.jdbc.engine.version | 4 | Default JDBC version used when the user submits a script without specifying a version |
| wds.linkis.shell.engine.version | 1 | Default Shell version used when the user submits a script without specifying a version |
| wds.linkis.appconn.engine.version | v1 | Default AppConn version used when the user submits a script without specifying a version |
| wds.linkis.entrance.scheduler.maxParallelismUsers | 1000 | Maximum number of concurrent users supported by Entrance |
| wds.linkis.entrance.job.persist.wait.max | 5m | Maximum time Entrance waits for JobHistory to persist a Job |
| wds.linkis.entrance.config.log.path | None | If not configured, the value of wds.linkis.filesystem.hdfs.root.path is used by default |
| wds.linkis.default.requestApplication.name | IDE | Default submission system when no submission system is specified |
| wds.linkis.default.runType | sql | Default script type when no script type is specified |
| wds.linkis.warn.log.exclude | org.apache,hive.ql,hive.metastore,com.netflix,org.apache | Real-time WARN-level logs not pushed to the client by default |
| wds.linkis.log.exclude | org.apache,hive.ql,hive.metastore,com.netflix,org.apache,com.webank | Real-time INFO-level logs not pushed to the client by default |
| wds.linkis.instance | 3 | Default number of concurrent jobs per engine |
| wds.linkis.max.ask.executor.time | 5m | Maximum time to ask LinkisManager for an available EngineConn |
| wds.linkis.hive.special.log.include | org.apache.hadoop.hive.ql.exec.Task | Logs that are not filtered by default when pushing Hive logs to the client |
| wds.linkis.spark.special.log.include | org.apache.linkis.engine.spark.utils.JobProgressUtil | Logs that are not filtered by default when pushing Spark logs to the client |
| wds.linkis.entrance.shell.danger.check.enabled | false | Whether to check for and intercept dangerous Shell syntax |
| wds.linkis.shell.danger.usage | rm,sh,find,kill,python,for,source,hdfs,hadoop,spark-sql,spark-submit,pyspark,spark-shell,hive,yarn | Shell syntax considered dangerous by default |
| wds.linkis.shell.white.usage | cd,ls | Shell whitelist syntax |
| wds.linkis.sql.default.limit | 5000 | Default maximum number of rows in a SQL result set |

2.2 EngineConn Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.engineconn.resultSet.default.store.path | hdfs:///tmp | Default storage path for job result sets |
| wds.linkis.engine.resultSet.cache.max | 0k | If a result set is smaller than this value, the EngineConn returns it to Entrance directly without writing it to disk |
| wds.linkis.engine.default.limit | 5000 | |
| wds.linkis.engine.lock.expire.time | 120000 | Maximum idle time of the engine lock, i.e. how long after Entrance acquires the lock without submitting code to the EngineConn the lock is released |
| wds.linkis.engineconn.ignore.words | org.apache.spark.deploy.yarn.Client | Logs ignored by default when the engine pushes logs to Entrance |
| wds.linkis.engineconn.pass.words | org.apache.hadoop.hive.ql.exec.Task | Logs that must be pushed by default when the engine pushes logs to Entrance |
| wds.linkis.engineconn.heartbeat.time | 3m | Default heartbeat interval from EngineConn to LinkisManager |
| wds.linkis.engineconn.max.free.time | 1h | Maximum idle time of an EngineConn |

2.3 EngineConnManager Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.ecm.memory.max | 80g | Maximum total memory the ECM can use to start EngineConns |
| wds.linkis.ecm.cores.max | 50 | Maximum number of CPU cores the ECM can use to start EngineConns |
| wds.linkis.ecm.engineconn.instances.max | 50 | Maximum number of EngineConns that can be started; it is generally recommended to set this to the same value as wds.linkis.ecm.cores.max |
| wds.linkis.ecm.protected.memory | 4g | Protected memory of the ECM: the memory used to start EngineConns cannot exceed wds.linkis.ecm.memory.max - wds.linkis.ecm.protected.memory |
| wds.linkis.ecm.protected.cores.max | 2 | Number of protected ECM CPU cores; same meaning as wds.linkis.ecm.protected.memory |
| wds.linkis.ecm.protected.engine.instances | 2 | Number of protected ECM instances |
| wds.linkis.engineconn.wait.callback.pid | 3s | Waiting time for an EngineConn to return its pid |

2.4 LinkisManager Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.manager.am.engine.start.max.time | 10m | Maximum time for LinkisManager to start a new EngineConn |
| wds.linkis.manager.am.engine.reuse.max.time | 5m | Maximum selection time for LinkisManager to reuse an existing EngineConn |
| wds.linkis.manager.am.engine.reuse.count.limit | 10 | Maximum number of polls for LinkisManager to reuse an existing EngineConn |
| wds.linkis.multi.user.engine.types | jdbc,es,presto | Engine types for which the user is not part of the reuse rule when LinkisManager reuses an existing EngineConn |
| wds.linkis.rm.instance | 10 | Default maximum number of instances per engine per user |
| wds.linkis.rm.yarnqueue.cores.max | 150 | Maximum number of cores used by each user in each engine's queue |
| wds.linkis.rm.yarnqueue.memory.max | 450g | Maximum memory used by each user in each engine's queue |
| wds.linkis.rm.yarnqueue.instance.max | 30 | Maximum number of applications launched by each user in each engine's queue |

3. Engine Configuration

3.1 JDBC Engine Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.jdbc.default.limit | 5000 | Default maximum number of rows in a returned result set |
| wds.linkis.jdbc.support.dbs | mysql=>com.mysql.jdbc.Driver,postgresql=>org.postgresql.Driver,oracle=>oracle.jdbc.driver.OracleDriver,hive2=>org.apache.hive.jdbc.HiveDriver,presto=>com.facebook.presto.jdbc.PrestoDriver | Drivers supported by the JDBC engine |
| wds.linkis.engineconn.jdbc.concurrent.limit | 100 | Maximum number of SQL statements executed in parallel |
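The wds.linkis.jdbc.support.dbs value is a comma-separated list of dbType=>driverClass pairs, so supporting an additional database is a matter of appending another pair. A sketch keeping only two of the documented drivers for brevity:

```properties
# Each entry maps a database type to its JDBC driver class
wds.linkis.jdbc.support.dbs=mysql=>com.mysql.jdbc.Driver,postgresql=>org.postgresql.Driver
```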

3.2 Python Engine Configuration

| parameter name | default value | description |
| --- | --- | --- |
| pythonVersion | /appcom/Install/anaconda3/bin/python | Python command path |
| python.path | None | Additional path for Python; only shared storage paths are accepted |

3.3 Spark Engine Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.engine.spark.language-repl.init.time | 30s | Maximum initialization time for the Scala and Python command interpreters |
| PYSPARK_DRIVER_PYTHON | python | Python command path |
| wds.linkis.server.spark-submit | spark-submit | spark-submit command path |

4. PublicEnhancements Configuration

4.1 BML Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.bml.dws.version | v1 | Version number of Linkis RESTful requests |
| wds.linkis.bml.auth.token.key | Validation-Code | Password-free token key for BML requests |
| wds.linkis.bml.auth.token.value | BML-AUTH | Password-free token value for BML requests |
| wds.linkis.bml.hdfs.prefix | /tmp/linkis | Prefix of the path where BML files are stored on HDFS |

4.2 Metadata Configuration

| parameter name | default value | description |
| --- | --- | --- |
| hadoop.config.dir | /appcom/config/hadoop-config | If it does not exist, the value of the environment variable HADOOP_CONF_DIR is used by default |
| hive.config.dir | /appcom/config/hive-config | If it does not exist, the value of the environment variable HIVE_CONF_DIR is used by default |
| hive.meta.url | None | URL of the HiveMetaStore database; must be configured if hive.config.dir is not configured |
| hive.meta.user | None | User of the HiveMetaStore database |
| hive.meta.password | None | Password for the HiveMetaStore database |

4.3 JobHistory Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.jobhistory.admin | None | Default admin account, used to specify which users can view everyone's execution history |

4.4 FileSystem Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.filesystem.root.path | file:///tmp/linkis/ | User's local Linux root directory |
| wds.linkis.filesystem.hdfs.root.path | hdfs:///tmp/ | User's HDFS root directory |
| wds.linkis.workspace.filesystem.hdfsuserrootpath.suffix | /linkis/ | First-level suffix after the user's HDFS root directory; the actual user root directory is ${hdfs.root.path}/${user}/${hdfsuserrootpath.suffix} |
| wds.linkis.workspace.resultset.download.is.limit | true | Whether to limit the number of rows when the client downloads a result set |
| wds.linkis.workspace.resultset.download.maxsize.csv | 5000 | Maximum number of rows when the result set is downloaded as a CSV file |
| wds.linkis.workspace.resultset.download.maxsize.excel | 5000 | Maximum number of rows when the result set is downloaded as an Excel file |
| wds.linkis.workspace.filesystem.get.timeout | 2000L | Maximum timeout for requests to the underlying filesystem (if your HDFS or Linux machine performs poorly, increasing this value is recommended) |

4.5 UDF Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.udf.share.path | /mnt/bdap/udf | Shared UDF storage path; setting it to an HDFS path is recommended |

5. MicroService Configuration

5.1 Gateway Configuration

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.gateway.conf.enable.proxy.user | false | Whether to enable proxy user mode; if enabled, the login user's requests are proxied to the proxy user for execution |
| wds.linkis.gateway.conf.proxy.user.config | proxy.properties | Storage file for proxy rules |
| wds.linkis.gateway.conf.proxy.user.scan.interval | 600000 | Refresh interval for the proxy file |
| wds.linkis.gateway.conf.enable.token.auth | false | Whether to enable token login mode; if enabled, access to Linkis with a token is allowed |
| wds.linkis.gateway.conf.token.auth.config | token.properties | Storage file for token rules |
| wds.linkis.gateway.conf.token.auth.scan.interval | 600000 | Refresh interval for the token file |
| wds.linkis.gateway.conf.url.pass.auth | /dws/ | Requests under these URLs skip login verification by default |
| wds.linkis.gateway.conf.enable.sso | false | Whether to enable SSO user login mode |
| wds.linkis.gateway.conf.sso.interceptor | None | If SSO login mode is enabled, users need to implement SSOInterceptor to jump to the SSO login page |
| wds.linkis.admin.user | hadoop | List of admin users |
| wds.linkis.login_encrypt.enable | false | Whether to enable RSA-encrypted transmission of user logins |
| wds.linkis.enable.gateway.auth | false | Whether to enable the Gateway IP whitelist mechanism |
| wds.linkis.gateway.auth.file | auth.txt | IP whitelist storage file |

6. Data Source and Metadata Service Configuration

6.1 Metadata Service Configuration

| introduced version | parameter name | default value | description |
| --- | --- | --- | --- |
| v1.1.0 | wds.linkis.server.mdm.service.lib.dir | /lib/linkis-public-enhancements/linkis-ps-metadatamanager/service | Relative path of the data source jars to load; they are loaded via reflection |
| v1.1.0 | wds.linkis.server.mdm.service.instance.expire-in-seconds | 60 | Expiration time for loading sub-services; the service will not be loaded after this time |
| v1.1.0 | wds.linkis.server.dsm.app.name | linkis-ps-data-source-manager | Service used to obtain data source information |
| v1.1.0 | wds.linkis.server.mdm.service.kerberos.principle | hadoop/HOST@EXAMPLE.COM | Kerberos principal for the linkis-metadata Hive service |
| v1.1.0 | wds.linkis.server.mdm.service.user | hadoop | Access user of the Hive service |
| v1.1.0 | wds.linkis.server.mdm.service.kerberos.krb5.path | "" | Kerberos krb5 path used by the Hive service |
| v1.1.0 | wds.linkis.server.mdm.service.temp.location | classpath:/tmp | Temporary path for Kafka and Hive |
| v1.1.0 | wds.linkis.server.mdm.service.sql.driver | com.mysql.jdbc.Driver | Driver of the MySQL service |
| v1.1.0 | wds.linkis.server.mdm.service.sql.url | jdbc:mysql://%s:%s/%s | URL format of the MySQL service |
| v1.1.0 | wds.linkis.server.mdm.service.sql.connect.timeout | 3000 | Connection timeout when connecting to the MySQL service |
| v1.1.0 | wds.linkis.server.mdm.service.sql.socket.timeout | 6000 | Socket timeout when opening the MySQL service |
| v1.1.0 | wds.linkis.server.mdm.service.temp.location | /tmp/keytab | Local temporary storage path of the service, mainly for authentication files downloaded from the BML material service |

7. Common Scene Parameters

7.1 Open Test Mode

Development often requires password-free interfaces; the following can be replaced or appended in linkis.properties.

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.test.mode | false | Whether to enable debug mode; if set to true, all microservices support password-free login and all EngineConns open remote debugging ports |
| wds.linkis.test.user | hadoop | The default login user for password-free login when wds.linkis.test.mode=true |
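A minimal sketch of linkis.properties for local debugging (hadoop is the conventional example user; use an account that exists in your environment):

```properties
# Enable password-free login and EngineConn remote debugging ports
wds.linkis.test.mode=true
wds.linkis.test.user=hadoop
```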

7.2 Login User Settings

Apache Linkis manages admin users through configuration files by default; this configuration can be replaced or appended in linkis-mg-gateway.properties. For multi-user access, use the LDAP implementation.

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.admin.user | hadoop | Admin username |
| wds.linkis.admin.password | 123456 | Admin user password |
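For example, in linkis-mg-gateway.properties (the password shown is a placeholder; choose your own):

```properties
wds.linkis.admin.user=hadoop
# Placeholder value; replace with a strong password
wds.linkis.admin.password=your-strong-password
```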

7.3 LDAP Settings

Apache Linkis can access LDAP through parameters to achieve multi-user management, and this configuration can be replaced or added in linkis-mg-gateway.properties.

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.ldap.proxy.url | None | LDAP URL address |
| wds.linkis.ldap.proxy.baseDN | None | LDAP baseDN address |
| wds.linkis.ldap.proxy.userNameFormat | None | |
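A sketch of an LDAP setup in linkis-mg-gateway.properties; the server address and baseDN below are hypothetical:

```properties
# Hypothetical LDAP server and baseDN; substitute your own
wds.linkis.ldap.proxy.url=ldap://ldap.example.com:389/
wds.linkis.ldap.proxy.baseDN=dc=example,dc=com
```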

7.4 Turn off Resource Checking

Apache Linkis may report exceptions when submitting tasks, such as insufficient resources; resource checking can be turned off by replacing or appending this configuration in linkis-cg-linkismanager.properties.

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.manager.rm.request.enable | true | Whether to enable resource checking |
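For example, to skip resource checking while debugging insufficient-resource errors:

```properties
# Disable resource checking (debugging only)
wds.linkis.manager.rm.request.enable=false
```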

7.5 Turn on Engine Debugging

Apache Linkis EC can enable debugging mode, and this configuration can be replaced or added in linkis-cg-linkismanager.properties.

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.engineconn.debug.enable | false | Whether to enable engine debugging |
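For example, in linkis-cg-linkismanager.properties:

```properties
# Enable EngineConn debugging mode
wds.linkis.engineconn.debug.enable=true
```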

7.6 Hive Metadata Configuration

The public-service service of Apache Linkis needs to read Hive metadata; this configuration can be replaced or appended in linkis-ps-publicservice.properties.

| parameter name | default value | description |
| --- | --- | --- |
| hive.meta.url | None | URL of the HiveMetaStore database |
| hive.meta.user | None | User of the HiveMetaStore database |
| hive.meta.password | None | Password for the HiveMetaStore database |
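A sketch in linkis-ps-publicservice.properties; the host, database name, and credentials below are hypothetical:

```properties
# Hypothetical HiveMetaStore database connection; substitute your own
hive.meta.url=jdbc:mysql://127.0.0.1:3306/hivemeta?characterEncoding=UTF-8
hive.meta.user=hive
hive.meta.password=hive-password
```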

7.7 Linkis Database Configuration

Apache Linkis uses MySQL as its data store by default; this configuration can be replaced or appended in linkis.properties.

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.server.mybatis.datasource.url | None | Database connection string, for example: jdbc:mysql://127.0.0.1:3306/dss?characterEncoding=UTF-8 |
| wds.linkis.server.mybatis.datasource.username | None | Database user name, for example: root |
| wds.linkis.server.mybatis.datasource.password | None | Database password, for example: root |
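Using the example values from the table above, linkis.properties would contain:

```properties
# Example values from the table above; substitute your own database and credentials
wds.linkis.server.mybatis.datasource.url=jdbc:mysql://127.0.0.1:3306/dss?characterEncoding=UTF-8
wds.linkis.server.mybatis.datasource.username=root
wds.linkis.server.mybatis.datasource.password=root
```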

7.8 Linkis Session Cache Configuration

Apache Linkis supports using redis for session sharing; this configuration can be replaced or appended in linkis.properties.

| parameter name | default value | description |
| --- | --- | --- |
| linkis.session.redis.cache.enabled | None | Whether to enable Redis session sharing |
| linkis.session.redis.host | 127.0.0.1 | Redis hostname |
| linkis.session.redis.port | 6379 | Redis port |
| linkis.session.redis.password | None | Redis password |
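A sketch enabling Redis session sharing in linkis.properties; the password below is a placeholder:

```properties
linkis.session.redis.cache.enabled=true
linkis.session.redis.host=127.0.0.1
linkis.session.redis.port=6379
# Placeholder; replace with your Redis password
linkis.session.redis.password=your-redis-password
```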

7.9 Linkis Module Development Configuration

When developing Apache Linkis, you can use this parameter to customize the database, Rest interface, and entity objects of the loading module; you can modify it in linkis-ps-publicservice.properties, and use commas to separate multiple modules.

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.server.restful.scan.packages | None | RESTful scan packages, for example: org.apache.linkis.basedatamanager.server.restful |
| wds.linkis.server.mybatis.mapperLocations | None | MyBatis mapper file path, for example: classpath:org/apache/linkis/basedatamanager/server/dao/mapper/*.xml |
| wds.linkis.server.mybatis.typeAliasesPackage | None | Entity alias scan package, for example: org.apache.linkis.basedatamanager.server.domain |
| wds.linkis.server.mybatis.BasePackage | None | Database DAO layer scan package, for example: org.apache.linkis.basedatamanager.server.dao |
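Using the example packages from the table above, a module's scan configuration in linkis-ps-publicservice.properties might look like:

```properties
wds.linkis.server.restful.scan.packages=org.apache.linkis.basedatamanager.server.restful
wds.linkis.server.mybatis.mapperLocations=classpath:org/apache/linkis/basedatamanager/server/dao/mapper/*.xml
wds.linkis.server.mybatis.typeAliasesPackage=org.apache.linkis.basedatamanager.server.domain
wds.linkis.server.mybatis.BasePackage=org.apache.linkis.basedatamanager.server.dao
```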

7.10 Linkis Module Development Configuration

This parameter can be used to customize the route of loading modules during Apache Linkis development; it can be modified in linkis.properties, and commas are used to separate multiple modules.

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.gateway.conf.publicservice.list | cs,contextservice,data-source-manager,metadataQuery,metadatamanager,query,jobhistory,application,configuration,filesystem,udf,variable,microservice,errorcode,bml,datasource,basedata-manager | Routing modules supported by the publicservice services |

7.11 Linkis File System And Material Storage Path

These parameters configure the Linkis file system and material (BML) storage paths; they can be modified in linkis.properties.

| parameter name | default value | description |
| --- | --- | --- |
| wds.linkis.filesystem.root.path | file:///tmp/linkis/ | Local user directory; a folder named after the user name needs to be created under this directory |
| wds.linkis.filesystem.hdfs.root.path | hdfs:///tmp/ | HDFS user directory |
| wds.linkis.bml.is.hdfs | true | Whether to enable HDFS |
| wds.linkis.bml.hdfs.prefix | /apps-data | HDFS path |
| wds.linkis.bml.local.prefix | /apps-data | Local path |