I want to convert my DataFrame from pyspark.sql.connect.dataframe.DataFrame (the Spark Connect DataFrame) to pyspark.sql.dataframe.DataFrame (the classic DataFrame) so that I can run StringIndexer and VectorAssembler on it.
When I run them on a pyspark.sql.connect.dataframe.DataFrame, I get an AssertionError:
    File /databricks/python/lib/python3.12/site-packages/pyspark/ml/wrapper.py:87, in JavaWrapper._new_java_obj(java_class, *args)
         84 from pyspark.core.context import SparkContext
         86 sc = SparkContext._active_spark_context
    ---> 87 assert sc is not None
Thank you!