Amazon EMR release 5.14.0

5.14.0 application versions

The following applications are supported in this release: Flink, Ganglia, HBase, HCatalog, Hadoop, Hive, Hue, JupyterHub, Livy, MXNet, Mahout, Oozie, Phoenix, Pig, Presto, Spark, Sqoop, Tez, Zeppelin, and ZooKeeper.

The table below lists the application versions available in this release of Amazon EMR and the application versions in the preceding three Amazon EMR releases (when applicable).

For a comprehensive history of application versions for each release of Amazon EMR, see the application version history topics for each Amazon EMR release series in the Amazon EMR Release Guide.

Application version information
Application | emr-5.14.0 | emr-5.13.1 | emr-5.13.0 | emr-5.12.3
AWS SDK for Java | 1.11.297 | 1.11.297 | 1.11.297 | 1.11.267
Python | 2.7, 3.4 | 2.7, 3.4 | 2.7, 3.4 | 2.7, 3.4
Scala | 2.11.8 | 2.11.8 | 2.11.8 | 2.11.8
AmazonCloudWatchAgent | - | - | - | -
Delta | - | - | - | -
Flink | 1.4.2 | 1.4.0 | 1.4.0 | 1.4.0
Ganglia | 3.7.2 | 3.7.2 | 3.7.2 | 3.7.2
HBase | 1.4.2 | 1.4.2 | 1.4.2 | 1.4.0
HCatalog | 2.3.2 | 2.3.2 | 2.3.2 | 2.3.2
Hadoop | 2.8.3 | 2.8.3 | 2.8.3 | 2.8.3
Hive | 2.3.2 | 2.3.2 | 2.3.2 | 2.3.2
Hudi | - | - | - | -
Hue | 4.1.0 | 4.1.0 | 4.1.0 | 4.1.0
Iceberg | - | - | - | -
JupyterEnterpriseGateway | - | - | - | -
JupyterHub | 0.8.1 | - | - | -
Livy | 0.4.0 | 0.4.0 | 0.4.0 | 0.4.0
MXNet | 1.1.0 | 1.0.0 | 1.0.0 | 1.0.0
Mahout | 0.13.0 | 0.13.0 | 0.13.0 | 0.13.0
Oozie | 4.3.0 | 4.3.0 | 4.3.0 | 4.3.0
Phoenix | 4.13.0 | 4.13.0 | 4.13.0 | 4.13.0
Pig | 0.17.0 | 0.17.0 | 0.17.0 | 0.17.0
Presto | 0.194 | 0.194 | 0.194 | 0.188
Spark | 2.3.0 | 2.3.0 | 2.3.0 | 2.2.1
Sqoop | 1.4.7 | 1.4.6 | 1.4.6 | 1.4.6
TensorFlow | - | - | - | -
Tez | 0.8.4 | 0.8.4 | 0.8.4 | 0.8.4
Trino (PrestoSQL) | - | - | - | -
Zeppelin | 0.7.3 | 0.7.3 | 0.7.3 | 0.7.3
ZooKeeper | 3.4.10 | 3.4.10 | 3.4.10 | 3.4.10
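
If you prefer to retrieve this information programmatically, the AWS CLI can return the application list for a release label. The following is a minimal sketch; it assumes that the describe-release-label operation is available in your installed AWS CLI version and that your credentials allow it:

  # Minimal sketch: list the applications and versions that Amazon EMR reports
  # for the emr-5.14.0 release label.
  aws emr describe-release-label \
    --release-label emr-5.14.0 \
    --query Applications \
    --output table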

5.14.0 release notes

The following release notes include information for Amazon EMR release 5.14.0. Changes are relative to 5.13.0.

Initial release date: June 4, 2018

Upgrades
  • Upgraded Apache Flink to 1.4.2

  • Upgraded Apache MXNet to 1.1.0

  • Upgraded Apache Sqoop to 1.4.7

New features
  • Added JupyterHub support. For more information, see JupyterHub.

Changes, enhancements, and resolved issues
  • EMRFS

    • The userAgent string in requests to Amazon S3 has been updated to contain the user and group information of the invoking principal. This can be used with AWS CloudTrail logs for more comprehensive request tracking; a sketch of inspecting the field appears after this list.

  • HBase

    • Included HBASE-20447, which addresses an issue that could cause cache problems, especially with split Regions.

  • MXNet

    • Added OpenCV libraries.

  • Spark

    • When Spark writes Parquet files to an Amazon S3 location using EMRFS, the FileOutputCommitter algorithm has been updated to use version 2 instead of version 1. This reduces the number of renames, which improves application performance; a sketch of reverting to the previous behavior appears after this list. This change does not affect:

      • Applications other than Spark.

      • Applications that write to other file systems, such as HDFS (which still use version 1 of FileOutputCommitter).

      • Applications that use other output formats, such as text or csv, that already use EMRFS direct write.
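
The following is a minimal sketch of restoring the previous commit behavior for a single Spark job, assuming the change is driven by the standard FileOutputCommitter property; the application class, JAR location, and bucket name are placeholders:

  # Sketch only: force FileOutputCommitter algorithm version 1 for one job.
  # The class name, JAR path, and bucket below are placeholders.
  spark-submit \
    --conf spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version=1 \
    --class com.example.MyApp \
    s3://amzn-s3-demo-bucket/my-app.jar

The same property could also be set cluster-wide through the spark-defaults configuration classification.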

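As an illustration of the EMRFS change above, the following sketch prints the userAgent field recorded for Amazon S3 data events in a delivered CloudTrail log file. It assumes a trail that logs S3 data events to an S3 bucket and that jq is installed; the bucket name, account ID, and object key are placeholders:

  # Illustrative only: download one delivered CloudTrail log file and print the
  # userAgent recorded for S3 data events. Bucket, account ID, and key are placeholders.
  aws s3 cp s3://amzn-s3-demo-logging-bucket/AWSLogs/111122223333/CloudTrail/us-east-1/2018/06/04/example.json.gz - \
    | gunzip \
    | jq '.Records[] | select(.eventSource == "s3.amazonaws.com") | {eventName, userAgent}'
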
Known issues
  • JupyterHub

    • Using configuration classifications to set up JupyterHub and individual Jupyter notebooks when you create a cluster is not supported. Manually edit the jupyterhub_config.py file and the jupyter_notebook_config.py file for each user. For more information, see Configuring JupyterHub.

    • JupyterHub fails to start on clusters in a private subnet, with the message Error: ENOENT: no such file or directory, open '/etc/jupyter/conf/server.crt'. This is caused by an error in the script that generates self-signed certificates. Use the following workaround to generate the self-signed certificates manually. Run all commands while connected to the primary node.

      1. Copy the certificate generation script from the container to the primary node:

        sudo docker cp jupyterhub:/tmp/gen_self_signed_cert.sh ./
      2. Use a text editor to change line 23 of the script so that it uses the local hostname instead of the public hostname, as shown in the following example:

        local hostname=$(curl -s $EC2_METADATA_SERVICE_URI/local-hostname)
      3. Run the script to generate self-signed certificates:

        sudo bash ./gen_self_signed_cert.sh
      4. Move the certificate files that the script generates to the /etc/jupyter/conf/ directory:

        sudo mv /tmp/server.crt /tmp/server.key /etc/jupyter/conf/

      You can tail the jupyter.log file to verify that JupyterHub restarted and is returning a 200 response code. For example:

      tail -f /var/log/jupyter/jupyter.log

      This should return a response similar to the following:

      [I 2018-06-14 18:56:51.356 JupyterHub app:1581] JupyterHub is now running at https://:9443/
      19:01:51.359 - info: [ConfigProxy] 200 GET /api/routes
  • After the primary node reboots or the instance controller restarts, CloudWatch metrics are not collected and the automatic scaling feature is unavailable in Amazon EMR versions 5.14.0, 5.15.0, and 5.16.0. This issue is fixed in Amazon EMR 5.17.0.

5.14.0 component versions

The components that Amazon EMR installs with this release are listed below. Some are installed as part of big-data application packages. Others are unique to Amazon EMR and installed for system processes and features. These typically start with emr or aws. Big-data application packages in the most recent Amazon EMR release are usually the latest version found in the community. We make community releases available in Amazon EMR as quickly as possible.

Some components in Amazon EMR differ from community versions. These components have a version label in the form CommunityVersion-amzn-EmrVersion. The EmrVersion starts at 0. For example, if an open source community component named myapp-component with version 2.2 has been modified three times for inclusion in different Amazon EMR releases, its release version is listed as 2.2-amzn-2.

Component | Version | Description
aws-sagemaker-spark-sdk | 1.0.1 | Amazon SageMaker Spark SDK
emr-ddb | 4.5.0 | Amazon DynamoDB connector for Hadoop ecosystem applications.
emr-goodies | 2.4.0 | Extra convenience libraries for the Hadoop ecosystem.
emr-kinesis | 3.4.0 | Amazon Kinesis connector for Hadoop ecosystem applications.
emr-s3-dist-cp | 2.10.0 | Distributed copy application optimized for Amazon S3.
emrfs | 2.23.0 | Amazon S3 connector for Hadoop ecosystem applications.
flink-client | 1.4.2 | Apache Flink command line client scripts and applications.
ganglia-monitor | 3.7.2 | Embedded Ganglia agent for Hadoop ecosystem applications along with the Ganglia monitoring agent.
ganglia-metadata-collector | 3.7.2 | Ganglia metadata collector for aggregating metrics from Ganglia monitoring agents.
ganglia-web | 3.7.1 | Web application for viewing metrics collected by the Ganglia metadata collector.
hadoop-client | 2.8.3-amzn-1 | Hadoop command-line clients such as 'hdfs', 'hadoop', or 'yarn'.
hadoop-hdfs-datanode | 2.8.3-amzn-1 | HDFS node-level service for storing blocks.
hadoop-hdfs-library | 2.8.3-amzn-1 | HDFS command-line client and library
hadoop-hdfs-namenode | 2.8.3-amzn-1 | HDFS service for tracking file names and block locations.
hadoop-httpfs-server | 2.8.3-amzn-1 | HTTP endpoint for HDFS operations.
hadoop-kms-server | 2.8.3-amzn-1 | Cryptographic key management server based on Hadoop's KeyProvider API.
hadoop-mapred | 2.8.3-amzn-1 | MapReduce execution engine libraries for running a MapReduce application.
hadoop-yarn-nodemanager | 2.8.3-amzn-1 | YARN service for managing containers on an individual node.
hadoop-yarn-resourcemanager | 2.8.3-amzn-1 | YARN service for allocating and managing cluster resources and distributed applications.
hadoop-yarn-timeline-server | 2.8.3-amzn-1 | Service for retrieving current and historical information for YARN applications.
hbase-hmaster | 1.4.2 | Service for an HBase cluster responsible for coordination of Regions and execution of administrative commands.
hbase-region-server | 1.4.2 | Service for serving one or more HBase regions.
hbase-client | 1.4.2 | HBase command-line client.
hbase-rest-server | 1.4.2 | Service providing a RESTful HTTP endpoint for HBase.
hbase-thrift-server | 1.4.2 | Service providing a Thrift endpoint to HBase.
hcatalog-client | 2.3.2-amzn-2 | The 'hcat' command line client for manipulating hcatalog-server.
hcatalog-server | 2.3.2-amzn-2 | Service providing HCatalog, a table and storage management layer for distributed applications.
hcatalog-webhcat-server | 2.3.2-amzn-2 | HTTP endpoint providing a REST interface to HCatalog.
hive-client | 2.3.2-amzn-2 | Hive command line client.
hive-hbase | 2.3.2-amzn-2 | Hive-hbase client.
hive-metastore-server | 2.3.2-amzn-2 | Service for accessing the Hive metastore, a semantic repository storing metadata for SQL on Hadoop operations.
hive-server2 | 2.3.2-amzn-2 | Service for accepting Hive queries as web requests.
hue-server | 4.1.0 | Web application for analyzing data using Hadoop ecosystem applications
jupyterhub | 0.8.1 | Multi-user server for Jupyter notebooks
livy-server | 0.4.0-incubating | REST interface for interacting with Apache Spark
mahout-client | 0.13.0 | Library for machine learning.
mxnet | 1.1.0 | A flexible, scalable, and efficient library for deep learning.
mysql-server | 5.5.54+ | MySQL database server.
nvidia-cuda | 9.1.85 | Nvidia drivers and Cuda toolkit
oozie-client | 4.3.0 | Oozie command-line client.
oozie-server | 4.3.0 | Service for accepting Oozie workflow requests.
opencv | 3.4.0 | Open Source Computer Vision Library.
phoenix-library | 4.13.0-HBase-1.4 | The phoenix libraries for server and client
phoenix-query-server | 4.13.0-HBase-1.4 | A light weight server providing JDBC access as well as Protocol Buffers and JSON format access to the Avatica API
presto-coordinator | 0.194 | Service for accepting queries and managing query execution among presto-workers.
presto-worker | 0.194 | Service for executing pieces of a query.
pig-client | 0.17.0 | Pig command-line client.
r | 3.4.1 | The R Project for Statistical Computing
spark-client | 2.3.0 | Spark command-line clients.
spark-history-server | 2.3.0 | Web UI for viewing logged events for the lifetime of a completed Spark application.
spark-on-yarn | 2.3.0 | In-memory execution engine for YARN.
spark-yarn-slave | 2.3.0 | Apache Spark libraries needed by YARN slaves.
sqoop-client | 1.4.7 | Apache Sqoop command-line client.
tez-on-yarn | 0.8.4 | The tez YARN application and libraries.
webserver | 2.4.25+ | Apache HTTP server.
zeppelin-server | 0.7.3 | Web-based notebook that enables interactive data analytics.
zookeeper-server | 3.4.10 | Centralized service for maintaining configuration information, naming, providing distributed synchronization, and providing group services.
zookeeper-client | 3.4.10 | ZooKeeper command line client.
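
To confirm which build of a component is installed on a running cluster, many of the command-line clients listed above report their own versions. A few illustrative checks, run while connected to the primary node (output varies by component):

  hadoop version          # reports the Hadoop build, for example 2.8.3-amzn-1
  hive --version          # reports the Hive build, for example 2.3.2-amzn-2
  spark-submit --version  # reports the Spark version, for example 2.3.0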

5.14.0 configuration classifications

Configuration classifications allow you to customize applications. These often correspond to a configuration XML file for the application, such as hive-site.xml. For more information, see Configure applications.
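
For example, you could place a classification in a JSON file and reference that file when you create a cluster with the AWS CLI. The following is a minimal sketch; the file name myConfig.json, the instance settings, and the emrfs-site property shown are illustrative only:

  # Sketch only: apply a classification from a local JSON file at cluster creation.
  # myConfig.json might contain:
  #   [{"Classification": "emrfs-site", "Properties": {"fs.s3.maxConnections": "100"}}]
  aws emr create-cluster \
    --release-label emr-5.14.0 \
    --applications Name=Hadoop Name=Hive \
    --instance-type m4.large \
    --instance-count 3 \
    --use-default-roles \
    --configurations file://./myConfig.json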

emr-5.14.0 classifications
Classification | Description
capacity-scheduler | Change values in Hadoop's capacity-scheduler.xml file.
container-log4j | Change values in Hadoop YARN's container-log4j.properties file.
core-site | Change values in Hadoop's core-site.xml file.
emrfs-site | Change EMRFS settings.
flink-conf | Change flink-conf.yaml settings.
flink-log4j | Change Flink log4j.properties settings.
flink-log4j-yarn-session | Change Flink log4j-yarn-session.properties settings.
flink-log4j-cli | Change Flink log4j-cli.properties settings.
hadoop-env | Change values in the Hadoop environment for all Hadoop components.
hadoop-log4j | Change values in Hadoop's log4j.properties file.
hadoop-ssl-server | Change hadoop ssl server configuration
hadoop-ssl-client | Change hadoop ssl client configuration
hbase | Amazon EMR-curated settings for Apache HBase.
hbase-env | Change values in HBase's environment.
hbase-log4j | Change values in HBase's hbase-log4j.properties file.
hbase-metrics | Change values in HBase's hadoop-metrics2-hbase.properties file.
hbase-policy | Change values in HBase's hbase-policy.xml file.
hbase-site | Change values in HBase's hbase-site.xml file.
hdfs-encryption-zones | Configure HDFS encryption zones.
hdfs-site | Change values in HDFS's hdfs-site.xml.
hcatalog-env | Change values in HCatalog's environment.
hcatalog-server-jndi | Change values in HCatalog's jndi.properties.
hcatalog-server-proto-hive-site | Change values in HCatalog's proto-hive-site.xml.
hcatalog-webhcat-env | Change values in HCatalog WebHCat's environment.
hcatalog-webhcat-log4j2 | Change values in HCatalog WebHCat's log4j2.properties.
hcatalog-webhcat-site | Change values in HCatalog WebHCat's webhcat-site.xml file.
hive-beeline-log4j2 | Change values in Hive's beeline-log4j2.properties file.
hive-parquet-logging | Change values in Hive's parquet-logging.properties file.
hive-env | Change values in the Hive environment.
hive-exec-log4j2 | Change values in Hive's hive-exec-log4j2.properties file.
hive-llap-daemon-log4j2 | Change values in Hive's llap-daemon-log4j2.properties file.
hive-log4j2 | Change values in Hive's hive-log4j2.properties file.
hive-site | Change values in Hive's hive-site.xml file
hiveserver2-site | Change values in Hive Server2's hiveserver2-site.xml file
hue-ini | Change values in Hue's ini file
httpfs-env | Change values in the HTTPFS environment.
httpfs-site | Change values in Hadoop's httpfs-site.xml file.
hadoop-kms-acls | Change values in Hadoop's kms-acls.xml file.
hadoop-kms-env | Change values in the Hadoop KMS environment.
hadoop-kms-log4j | Change values in Hadoop's kms-log4j.properties file.
hadoop-kms-site | Change values in Hadoop's kms-site.xml file.
jupyter-notebook-conf | Change values in Jupyter Notebook's jupyter_notebook_config.py file.
jupyter-hub-conf | Change values in JupyterHub's jupyterhub_config.py file.
jupyter-sparkmagic-conf | Change values in Sparkmagic's config.json file.
livy-conf | Change values in Livy's livy.conf file.
livy-env | Change values in the Livy environment.
livy-log4j | Change Livy log4j.properties settings.
mapred-env | Change values in the MapReduce application's environment.
mapred-site | Change values in the MapReduce application's mapred-site.xml file.
oozie-env | Change values in Oozie's environment.
oozie-log4j | Change values in Oozie's oozie-log4j.properties file.
oozie-site | Change values in Oozie's oozie-site.xml file.
phoenix-hbase-metrics | Change values in Phoenix's hadoop-metrics2-hbase.properties file.
phoenix-hbase-site | Change values in Phoenix's hbase-site.xml file.
phoenix-log4j | Change values in Phoenix's log4j.properties file.
phoenix-metrics | Change values in Phoenix's hadoop-metrics2-phoenix.properties file.
pig-env | Change values in the Pig environment.
pig-properties | Change values in Pig's pig.properties file.
pig-log4j | Change values in Pig's log4j.properties file.
presto-log | Change values in Presto's log.properties file.
presto-config | Change values in Presto's config.properties file.
presto-env | Change values in Presto's presto-env.sh file.
presto-node | Change values in Presto's node.properties file.
presto-connector-blackhole | Change values in Presto's blackhole.properties file.
presto-connector-cassandra | Change values in Presto's cassandra.properties file.
presto-connector-hive | Change values in Presto's hive.properties file.
presto-connector-jmx | Change values in Presto's jmx.properties file.
presto-connector-kafka | Change values in Presto's kafka.properties file.
presto-connector-localfile | Change values in Presto's localfile.properties file.
presto-connector-mongodb | Change values in Presto's mongodb.properties file.
presto-connector-mysql | Change values in Presto's mysql.properties file.
presto-connector-postgresql | Change values in Presto's postgresql.properties file.
presto-connector-raptor | Change values in Presto's raptor.properties file.
presto-connector-redis | Change values in Presto's redis.properties file.
presto-connector-redshift | Change values in Presto's redshift.properties file.
presto-connector-tpch | Change values in Presto's tpch.properties file.
spark | Amazon EMR-curated settings for Apache Spark.
spark-defaults | Change values in Spark's spark-defaults.conf file.
spark-env | Change values in the Spark environment.
spark-hive-site | Change values in Spark's hive-site.xml file
spark-log4j | Change values in Spark's log4j.properties file.
spark-metrics | Change values in Spark's metrics.properties file.
sqoop-env | Change values in Sqoop's environment.
sqoop-oraoop-site | Change values in Sqoop OraOop's oraoop-site.xml file.
sqoop-site | Change values in Sqoop's sqoop-site.xml file.
tez-site | Change values in Tez's tez-site.xml file.
yarn-env | Change values in the YARN environment.
yarn-site | Change values in YARN's yarn-site.xml file.
zeppelin-env | Change values in the Zeppelin environment.
zookeeper-config | Change values in ZooKeeper's zoo.cfg file.
zookeeper-log4j | Change values in ZooKeeper's log4j.properties file.