[root@Host1 ~]# /Hadoop/hadoop-3.3.6/sbin/start-dfs.sh
Starting namenodes on [Host1]
Starting datanodes
Starting secondary namenodes [Host1]
[root@Host1 ~]# jps
2288 NameNode
2917 Jps
2760 SecondaryNameNode
[root@Host1 ~]# /Spark/spark-3.5.1-bin-hadoop3/sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /Spark/spark-3.5.1-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.master.Master-1-Host1.out
Host2: starting org.apache.spark.deploy.worker.Worker, logging to /Spark/spark-3.5.1-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Host2.out
Host1: starting org.apache.spark.deploy.worker.Worker, logging to /Spark/spark-3.5.1-bin-hadoop3/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Host1.out
[root@Host1 ~]# jps
2288 NameNode
3040 Worker
3090 Jps
2760 SecondaryNameNode
2954 Master
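After both start scripts, `jps` on the master host should list NameNode, SecondaryNameNode, Master, and Worker (worker-only hosts such as Host2 would show DataNode and Worker instead). A hypothetical helper to sanity-check captured `jps` output; the daemon set below assumes the single-master layout shown above:

```python
# Daemons expected on the master host in this tutorial's layout (an assumption,
# adjust for your own cluster topology).
EXPECTED_MASTER_DAEMONS = {"NameNode", "SecondaryNameNode", "Master", "Worker"}

def missing_daemons(jps_output: str, expected=EXPECTED_MASTER_DAEMONS):
    """Return the expected daemons that are absent from raw `jps` output."""
    running = {
        parts[1]
        for line in jps_output.strip().splitlines()
        if len(parts := line.split()) == 2
    }
    return expected - running

# Sample output matching the transcript above.
sample = """2288 NameNode
3040 Worker
3090 Jps
2760 SecondaryNameNode
2954 Master"""
print(missing_daemons(sample))  # prints set(): every expected daemon is up
```

If the result is non-empty, check the corresponding log file under the Hadoop or Spark `logs/` directory before continuing.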
[root@Host1 ~]# jupyter lab
[I 2024-03-09 19:55:55.155 ServerApp] jupyter_lsp | extension was successfully linked.
[I 2024-03-09 19:55:55.161 ServerApp] jupyter_server_terminals | extension was successfully linked.
[I 2024-03-09 19:55:55.169 ServerApp] jupyterlab | extension was successfully linked.
[I 2024-03-09 19:55:55.169 ServerApp] jupyterlab_code_formatter | extension was successfully linked.
[W 2024-03-09 19:55:55.178 ServerApp] notebook_dir is deprecated, use root_dir
[W 2024-03-09 19:55:55.179 ServerApp] ServerApp.password config is deprecated in 2.0. Use PasswordIdentityProvider.hashed_password.
[I 2024-03-09 19:55:55.180 ServerApp] notebook | extension was successfully linked.
[I 2024-03-09 19:55:55.948 ServerApp] notebook_shim | extension was successfully linked.
[I 2024-03-09 19:55:56.011 ServerApp] notebook_shim | extension was successfully loaded.
[I 2024-03-09 19:55:56.017 ServerApp] jupyter_lsp | extension was successfully loaded.
[I 2024-03-09 19:55:56.025 ServerApp] jupyter_server_terminals | extension was successfully loaded.
[I 2024-03-09 19:55:56.035 LabApp] JupyterLab extension loaded from /usr/local/lib/python3.9/site-packages/jupyterlab
[I 2024-03-09 19:55:56.035 LabApp] JupyterLab application directory is /usr/local/share/jupyter/lab
[I 2024-03-09 19:55:56.037 LabApp] Extension Manager is 'pypi'.
[I 2024-03-09 19:55:56.087 ServerApp] jupyterlab | extension was successfully loaded.
[I 2024-03-09 19:55:56.088 ServerApp] Registered jupyterlab_code_formatter server extension
[I 2024-03-09 19:55:56.088 ServerApp] jupyterlab_code_formatter | extension was successfully loaded.
[I 2024-03-09 19:55:56.093 ServerApp] notebook | extension was successfully loaded.
[I 2024-03-09 19:55:56.094 ServerApp] Serving notebooks from local directory: /Spark/source
[I 2024-03-09 19:55:56.094 ServerApp] Jupyter Server 2.12.5 is running at:
[I 2024-03-09 19:55:56.094 ServerApp] http://Host1:8888/lab
[I 2024-03-09 19:55:56.094 ServerApp] http://127.0.0.1:8888/lab
[I 2024-03-09 19:55:56.094 ServerApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
[I 2024-03-09 19:55:56.138 ServerApp] Skipped non-installed server(s): bash-language-server, dockerfile-language-server-nodejs, javascript-typescript-langserver, jedi-language-server, julia-language-server, pyright, python-language-server, python-lsp-server, r-languageserver, sql-language-server, texlab, typescript-language-server, unified-language-server, vscode-css-languageserver-bin, vscode-html-languageserver-bin, vscode-json-languageserver-bin, yaml-language-server
[I 2024-03-09 19:56:02.133 ServerApp] 302 GET / (@192.168.245.1) 1.87ms
[I 2024-03-09 19:56:02.137 LabApp] 302 GET /lab? (@192.168.245.1) 1.80ms
[I 2024-03-09 19:56:07.142 ServerApp] User 4d73778b7af049319d3ccab2bdf83f43 logged in.
[I 2024-03-09 19:56:07.144 ServerApp] 302 POST /login?next=%2Flab%3F (4d73778b7af049319d3ccab2bdf83f43@192.168.245.1) 225.05ms
[W 2024-03-09 19:56:08.311 LabApp] Could not determine jupyterlab build status without nodejs
^C[I 2024-03-09 20:04:19.425 ServerApp] interrupted
[I 2024-03-09 20:04:19.426 ServerApp] Serving notebooks from local directory: /Spark/source
0 active kernels
Jupyter Server 2.12.5 is running at:
http://Host1:8888/lab
http://127.0.0.1:8888/lab
- Connect to the Spark cluster's master host. From the Spark installation directory, run the following command (make sure the relative path to the directory is correct):

/Spark/spark-3.5.1-bin-hadoop3/bin/pyspark --master spark://Host1:7077
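The `--master` argument follows Spark's standalone URL scheme, `spark://HOST:PORT`, where 7077 is the default master port. A trivial sketch of that format, using the hostnames from this walkthrough (the helper itself is purely illustrative, not part of Spark):

```python
def master_url(host: str, port: int = 7077) -> str:
    """Build a Spark standalone master URL of the form spark://HOST:PORT."""
    return f"spark://{host}:{port}"

print(master_url("Host1"))  # prints spark://Host1:7077
```

The same URL can be passed to `SparkSession.builder.master(...)` inside a Jupyter notebook instead of launching the `pyspark` shell.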