Spark Scala - store query result as scala integer type

Question
I am trying to take the result of a Spark SQL query, which is expected to be a single-row, single-column integer value, and store it in a Scala variable to use further down. Here is my code:
val max_val_frame = spark.sql(f"""select max(val) as max_val from temp_table_1""");
//code to cast max_val_frame to integer
var max_val = //casted value from max_val_frame
var init_val = 1;
while (init_val <= max_val) {
  print(init_val);
  init_val = init_val + 1;
}
Does anyone know how I can cast that Spark DataFrame object to a Scala integer?
I have tried a number of things, including casting the entire DataFrame to a String first and then to an Int, but that does not properly extract the value as an Int.
Answer 1

Score: 1
You can use the following expression:
val max_val = max_val_frame.collect().head.getInt(0)
The .head call (or (0)) selects the first row; then getInt(0) or getDouble(0) extracts the element at index 0 of that row.
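Putting the answer together with the asker's loop, here is a minimal self-contained sketch. It assumes a local SparkSession and uses hypothetical sample data to stand in for temp_table_1; in a real job the session and the temp view would already exist.

```scala
import org.apache.spark.sql.SparkSession

object MaxValExample {
  def main(args: Array[String]): Unit = {
    // Assumption: a local session for illustration; real jobs usually receive one.
    val spark = SparkSession.builder()
      .appName("MaxValExample")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample data standing in for temp_table_1.
    Seq(1, 3, 5).toDF("val").createOrReplaceTempView("temp_table_1")

    val max_val_frame = spark.sql("select max(val) as max_val from temp_table_1")

    // collect() returns Array[Row]; head takes the single result row,
    // and getInt(0) reads column 0 as an Int.
    val max_val: Int = max_val_frame.collect().head.getInt(0)

    var init_val = 1
    while (init_val <= max_val) {
      print(init_val)
      init_val += 1
    }

    spark.stop()
  }
}
```

Note that max over an IntegerType column yields an IntegerType result, so getInt(0) matches here; if val were a LongType or DoubleType column, you would use getLong(0) or getDouble(0) instead, since Row accessors throw a ClassCastException on a type mismatch.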
Comments