
Monday, November 18, 2019

Loading an external jar in spark-shell



 basan:scala basan$ spark-shell --packages danielpes:spark-datetime-lite:0.2.0-s_2.11
Ivy Default Cache set to: /Users/basan/.ivy2/cache
The jars for the packages stored in: /Users/basan/.ivy2/jars
:: loading settings :: url = jar:file:/Users/basan/Documents/HadoopTraining/Spark/Installation/spark-2.4.4-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
danielpes#spark-datetime-lite added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-f0288cfe-9055-41c4-a6dc-e5e8e2e4163b;1.0
confs: [default]
found danielpes#spark-datetime-lite;0.2.0-s_2.11 in spark-packages
downloading https://dl.bintray.com/spark-packages/maven/danielpes/spark-datetime-lite/0.2.0-s_2.11/spark-datetime-lite-0.2.0-s_2.11.jar ...
[SUCCESSFUL ] danielpes#spark-datetime-lite;0.2.0-s_2.11!spark-datetime-lite.jar (955ms)
:: resolution report :: resolve 9221ms :: artifacts dl 957ms
:: modules in use:
danielpes#spark-datetime-lite;0.2.0-s_2.11 from spark-packages in [default]
---------------------------------------------------------------------
|                  |            modules            ||   artifacts   |
|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
|      default     |   1   |   1   |   1   |   0   ||   1   |   1   |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent-f0288cfe-9055-41c4-a6dc-e5e8e2e4163b
confs: [default]
1 artifacts copied, 0 already retrieved (35kB/7ms)
19/11/19 12:07:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
19/11/19 12:07:23 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
Spark context Web UI available at http://192.168.104.3:4041
Spark context available as 'sc' (master = local[*], app id = local-1574145444012).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.4
      /_/

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_191)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
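The session above uses the --packages flag, which makes spark-shell resolve the artifact by its coordinates (group:artifact:version) from the spark-packages repository through Ivy, cache it under ~/.ivy2, and put it on both the driver and executor classpaths. A couple of equivalent ways to achieve the same thing, sketched below; the local jar path is a placeholder based on where the log says the jar was stored, not a verified filename:

```shell
# 1. Resolve from a repository by coordinates (what the session above does)
spark-shell --packages danielpes:spark-datetime-lite:0.2.0-s_2.11

# 2. Point at a jar already on the local filesystem with --jars
#    (placeholder path -- adjust to wherever your jar actually lives)
spark-shell --jars /Users/basan/.ivy2/jars/danielpes_spark-datetime-lite-0.2.0-s_2.11.jar

# 3. The same dependency can be supplied as configuration instead of a flag
spark-shell --conf spark.jars.packages=danielpes:spark-datetime-lite:0.2.0-s_2.11
```

The --packages form is usually preferable for published artifacts because Ivy also pulls in transitive dependencies, while --jars only ships the exact files you list.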
