I had tried several different fixes, and while searching for information I learned that an improperly chosen hostname ==(a computer name containing an underscore)== can prevent spark-shell.cmd from starting properly.
A workable name is, for example, cxxuWin11.
With such a name, startup prints a line similar to http://cxxuWin11:4040.
Of course, the failure may also be caused by something else, so this fix is not guaranteed to work.
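A minimal PowerShell sketch for checking and renaming the machine, assuming the underscore-free name cxxuWin11 used above (Rename-Computer is a built-in cmdlet, but it must be run as administrator and only takes effect after a reboot):

```powershell
# Read the current computer name from the environment
$name = $env:COMPUTERNAME

if ($name -match '_') {
    Write-Host "Hostname '$name' contains an underscore; spark-shell.cmd may fail to start."
    # Rename to an underscore-free name (run as administrator; a reboot is required).
    # Rename-Computer -NewName "cxxuWin11" -Restart
} else {
    Write-Host "Hostname '$name' contains no underscore."
}
```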
```
PS C:\Users\cxxu> spark-shell.cmd
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://cxxuWin11:4040
Spark context available as 'sc' (master = local[*], app id = local-1639222458451).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.3
      /_/

Using Scala version 2.12.10 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_181)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 21/12/11 19:34:32 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped
q
```