I'm running a Spark job and it fails with the error below. It essentially says I need to refresh the table. How exactly do I do that?
The underlying files may have been updated.
You can explicitly invalidate the cache in Spark by running 'REFRESH TABLE tableName' command in SQL or by recreating the Dataset/DataFrame involved.
Thanks.
3 Answers
// Spark 1.x: refresh Spark's cached metadata and file listing for the table
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)  // sc is your existing SparkContext
hiveContext.refreshTable("tableName")
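In Spark 2.x and later, HiveContext is deprecated and the same refresh is done through the SparkSession. A minimal sketch, assuming an existing SparkSession named spark and a table called tableName (substitute your own table name):

// Option 1: run the SQL command the error message suggests
spark.sql("REFRESH TABLE tableName")

// Option 2: use the Catalog API, which is equivalent
spark.catalog.refreshTable("tableName")

Both invalidate Spark's cached file listing for the table, so the next query re-reads the underlying files. If the error happens because another process overwrites the files while your job is reading them, refreshing only fixes the symptom; you may also want to avoid reading and overwriting the same path in one job.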