Coming from SAS, I want to join multiple DataFrames in a single SQL join in PySpark. In SAS that is possible, but I get the sense that in PySpark it is not. My script looks like this:
A.createOrReplaceTempView("A")
B.createOrReplaceTempView("B")
C.createOrReplaceTempView("C")
D = spark.sql("""
    select a.*, b.VAR_B, c.VAR_C
    from A a
    left join B b on a.VAR = b.VAR
    left join C c on a.VAR = c.VAR
""")
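For comparison, here is how I would try to express the same thing with the DataFrame API if the single SQL statement turns out not to be valid (just a sketch, assuming the same column names VAR, VAR_B, and VAR_C as above):
# Chained left joins as a DataFrame-API alternative to the SQL above.
# Selecting only the needed columns from B and C avoids duplicate columns.
D = (
    A.join(B.select("VAR", "VAR_B"), on="VAR", how="left")
     .join(C.select("VAR", "VAR_C"), on="VAR", how="left")
)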
Is that possible in PySpark? Thank you!