I have the below command in a Databricks notebook, written in Python:
batdf = spark.sql(f"""
    select
        cast(from_unixtime(timestamp/1000, 'yyyy-MM-dd HH:mm:ss') as date) as event_date,
        from_unixtime(timestamp/1000, 'yyyy-MM-dd HH:mm:ss') as event_datetime,
        *
    from testable
""")
srcRecCount = batdf.count()
I have one more cell in the same notebook, written in Scala:
%scala
import java.time._
var srcRecCount: Long = 99999
val endPart = LocalDateTime.now()
val endPartDelta = endPart.toString.substring(0,19)
// Use double quotes so the exit payload is valid JSON
dbutils.notebook.exit(s"""{"date":"$endPartDelta", "srcRecCount":"$srcRecCount"}""")
I want to access the variable srcRecCount from the Python cell in the Scala cell of the Databricks notebook. Could you please let me know if this is possible?
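For context on what I have tried: Databricks runs each language cell in its own interpreter, so a Python variable is not directly visible to a %scala cell. A workaround I am considering (a sketch only; the key name myapp.srcRecCount is just an illustrative choice) is to pass the value through the shared Spark session config, which both interpreters can read:

```
# Python cell: publish the count through the shared Spark session
srcRecCount = batdf.count()
spark.conf.set("myapp.srcRecCount", str(srcRecCount))

%scala
// Scala cell: read the value back from the session config and parse it
val srcRecCount: Long = spark.conf.get("myapp.srcRecCount").toLong
```

An alternative might be to register the DataFrame as a temp view in Python (batdf.createOrReplaceTempView("batdf_view")) and recompute the count in Scala with spark.table("batdf_view").count(). Is one of these the recommended approach, or is there a more direct way?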