How to access variables/functions from one notebook in another notebook in Databricks
Question

I have a Configs notebook that holds all of my SQL connection configuration: the connection properties, jdbcUrl, username, and so on.

Now, when I run

dbutils.notebook.run("/Configs", 120)

in another notebook, it throws an error at spark.read.jdbc() saying that jdbcUrl and connectionProperties are not declared.

How can I access them in my current notebook?
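For context, a rough sketch of the setup described above, assuming it runs inside Databricks notebooks (where spark and dbutils are predefined). The path /Configs and the names jdbcUrl and connectionProperties come from the question; the host, database, driver, credentials, and table name are placeholders:

# /Configs notebook -- hypothetical version of the config notebook;
# every value below is a placeholder.
jdbc_hostname = "myserver.database.windows.net"
jdbc_port = 1433
jdbc_database = "mydb"

jdbcUrl = f"jdbc:sqlserver://{jdbc_hostname}:{jdbc_port};database={jdbc_database}"

connectionProperties = {
    "user": "my_user",
    "password": "my_password",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

And the calling notebook, showing the failing pattern from the question:

# dbutils.notebook.run() executes /Configs as a separate job, so nothing
# it defines ends up in this notebook's namespace.
dbutils.notebook.run("/Configs", 120)

# Fails: jdbcUrl and connectionProperties are not defined here.
df = spark.read.jdbc(url=jdbcUrl, table="dbo.my_table", properties=connectionProperties)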
Answer 1
Score: 1

dbutils.notebook.run executes the notebook as a separate job running on the same cluster. As mentioned in another answer, you need to use %run to include the declarations of one notebook in another (doc).

Here is a working example.
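The answer's own example is not reproduced here; as a rough sketch of the %run approach instead, reusing the /Configs path and variable names from the question (the table name is a placeholder). In Databricks, %run has to be the only command in its cell:

%run /Configs

Then, in the next cell, the names defined by /Configs are available directly:

# jdbcUrl and connectionProperties were defined by /Configs and are now
# in scope, because %run executes that notebook inline in this context.
df = spark.read.jdbc(url=jdbcUrl, table="dbo.my_table", properties=connectionProperties)
display(df)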
Answer 2
Score: 0

The other way to call the notebook is:

%run <databricks_notebookpath>
Answer 3
Score: 0

Use the magic command below; it works similarly to importing a module in Python:

%run <notebook_path>