
I'm trying to do a left outer join between two Kafka streams using PySpark and Structured Streaming (Spark 2.3).

import os

from pyspark.sql import SparkSession
from pyspark.sql.types import *
from pyspark.sql.functions import from_json, col, expr

# Must be set before the SparkSession (and its JVM) starts,
# so the Kafka source package ends up on the classpath.
os.environ['PYSPARK_SUBMIT_ARGS'] = '--packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0 pyspark-shell'

spark = SparkSession \
    .builder \
    .appName("Spark Kafka Structured Streaming") \
    .getOrCreate()

schema_impressions = StructType() \
    .add("id_req", StringType()) \
    .add("ts_imp_request", TimestampType()) \
    .add("country", StringType()) \
    .add("TS_IMPRESSION", TimestampType()) 

schema_requests = StructType() \
    .add("id_req", StringType()) \
    .add("page", StringType()) \
    .add("conntype", StringType()) \
    .add("TS_REQUEST", TimestampType()) 

impressions = spark.readStream \
  .format("kafka") \
  .option("kafka.bootstrap.servers", "ip-ec2.internal:9092") \
  .option("subscribe", "ssp.datascience_impressions") \
  .load()

requests = spark \
  .readStream \
  .format("kafka") \
  .option("kafka.bootstrap.servers", "ip-ec2.internal:9092") \
  .option("subscribe", "ssp.datascience_requests") \
  .option("startingOffsets", "latest") \
  .load()
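
(While debugging, a quick way to confirm the topics are reachable is a one-off batch read from the same source; a minimal sketch, assuming the same broker and topic:)

# Batch (non-streaming) read of the topic: handy for verifying
# connectivity and inspecting a few raw records.
sample = spark.read \
  .format("kafka") \
  .option("kafka.bootstrap.servers", "ip-ec2.internal:9092") \
  .option("subscribe", "ssp.datascience_requests") \
  .option("startingOffsets", "earliest") \
  .option("endingOffsets", "latest") \
  .load()

sample.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)").show(5, False)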

query_requests = requests \
        .select(col("timestamp"), col("key").cast("string"), from_json(col("value").cast("string"), schema_requests).alias("parsed")) \
        .select(col("timestamp").alias("timestamp_req"), "parsed.id_req", "parsed.page", "parsed.conntype", "parsed.TS_REQUEST") \
        .withWatermark("timestamp_req", "120 seconds") 

query_impressions = impressions \
        .select(col("timestamp"), col("key").cast("string"), from_json(col("value").cast("string"), schema_impressions).alias("parsed")) \
        .select(col("timestamp").alias("timestamp_imp"), col("parsed.id_req").alias("id_imp"), "parsed.ts_imp_request", "parsed.country", "parsed.TS_IMPRESSION") \
        .withWatermark("timestamp_imp", "120 seconds") 

query_requests.printSchema()        
query_impressions.printSchema()

root
 |-- timestamp_req: timestamp (nullable = true)
 |-- id_req: string (nullable = true)
 |-- page: string (nullable = true)
 |-- conntype: string (nullable = true)
 |-- TS_REQUEST: timestamp (nullable = true)

root
 |-- timestamp_imp: timestamp (nullable = true)
 |-- id_imp: string (nullable = true)
 |-- ts_imp_request: timestamp (nullable = true)
 |-- country: string (nullable = true)
 |-- TS_IMPRESSION: timestamp (nullable = true)
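
(While developing, each parsed stream can also be sanity-checked with a console sink before attempting the join; a minimal sketch, not part of the run below:)

# Print a few parsed rows to stdout to verify the from_json parsing.
debug = query_requests.writeStream \
    .format("console") \
    .option("truncate", "false") \
    .start()
# ...inspect the output, then stop it: debug.stop()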

In summary, I read data from the two Kafka streams and, in the lines below, join them on their IDs. As I understand it, Spark 2.3 only supports stream-stream outer joins when watermarks and a time-range join condition are defined, which is why the withWatermark calls above and the interval constraint below are there.

rawQuery = query_requests.join(query_impressions,  expr(""" 
    (id_req = id_imp AND 
    timestamp_imp >= timestamp_req AND 
    timestamp_imp <= timestamp_req + interval 5 minutes) 
    """), 
  "leftOuter")

rawQuery = rawQuery \
        .writeStream \
        .format("parquet") \
        .option("checkpointLocation", "/home/jovyan/streaming/applicationHistory") \
        .option("path", "/home/jovyan/streaming").start()
print(rawQuery.status)

{'message': 'Processing new data', 'isDataAvailable': True, 'isTriggerActive': True}

ERROR:root:Exception while sending command.
Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/py4j/java_gateway.py", line 1062, in send_command
    raise Py4JNetworkError("Answer from Java side is empty")
py4j.protocol.Py4JNetworkError: Answer from Java side is empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/py4j/java_gateway.py", line 908, in send_command
    response = connection.send_command(command)
  File "/opt/conda/lib/python3.6/site-packages/py4j/java_gateway.py", line 1067, in send_command
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
py4j.protocol.Py4JNetworkError: Error while receiving

ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java server (127.0.0.1:33968)
Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 2910, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "", line 3, in
    print(rawQuery.status)
  File "/opt/conda/lib/python3.6/site-packages/pyspark/sql/streaming.py", line 114, in status
    return json.loads(self._jsq.status().json())
  File "/opt/conda/lib/python3.6/site-packages/py4j/java_gateway.py", line 1160, in __call__
    answer, self.gateway_client, self.target_id, self.name)
  File "/opt/conda/lib/python3.6/site-packages/pyspark/sql/utils.py", line 63, in deco
    return f(*a, **kw)
  File "/opt/conda/lib/python3.6/site-packages/py4j/protocol.py", line 328, in get_return_value
    format(target_id, ".", name))
py4j.protocol.Py4JError: An error occurred while calling o92.status

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 1828, in showtraceback
    stb = value._render_traceback_()
AttributeError: 'Py4JError' object has no attribute '_render_traceback_'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/py4j/java_gateway.py", line 852, in _get_connection
    connection = self.deque.pop()
IndexError: pop from an empty deque

I'm running Spark locally from a Jupyter notebook. In spark/conf/spark-defaults.conf I have:

# Example:
# spark.master                     spark://master:7077
# spark.eventLog.enabled           true
# spark.eventLog.dir               hdfs://namenode:8021/directory
# spark.serializer                 org.apache.spark.serializer.KryoSerializer
spark.driver.memory             15g
# spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"

If I try to use Spark after the previous error, I get this:

ERROR:root:Exception while sending command.
Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/py4j/java_gateway.py", line 1062, in send_command
    raise Py4JNetworkError("Answer from Java side is empty")
py4j.protocol.Py4JNetworkError: Answer from Java side is empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/py4j/java_gateway.py", line 908, in send_command
    response = connection.send_command(command)
  File "/opt/conda/lib/python3.6/site-packages/py4j/java_gateway.py", line 1067, in send_command
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
py4j.protocol.Py4JNetworkError: Error while receiving

1 Answer


I resolved the problem! For some reason it was related to running inside Jupyter Notebook. I removed the following line from the code above:

os.environ['PYSPARK_SUBMIT_ARGS'] = '--packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0 pyspark-shell'

Then I ran the script from the console:

> spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0 spark_structured.py
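
(One detail when moving from the notebook to spark-submit: the script has to block on the streaming query, or the driver exits as soon as start() returns; a minimal sketch of how spark_structured.py would end:)

# Keep the driver alive until the streaming query stops or fails;
# any exception from the stream is raised here.
rawQuery.awaitTermination()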

Run that way, all the code worked without problems.

If you hit the same problem, you can also edit spark-defaults.conf and increase spark.driver.memory and spark.executor.memory.
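
Those settings can also be passed on the command line instead of editing the conf file (the memory values below are just examples, not recommendations):

> spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0 --driver-memory 15g --executor-memory 15g spark_structured.py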
