112,840 questions
0 votes · 0 answers · 56 views
Trying to read a BigQuery array column and pass it as columns to fetch from a Spark DataFrame
I have a BigQuery table with an array column named "column_list":
ALTER TABLE `test-project.TEST_DATASET.TEST_TABLE`
ADD COLUMN column_list ARRAY<STRING>;
update `test-project....
0 votes · 1 answer · 74 views
col function error: type mismatch; found String, required Int
I am attempting to programmatically remove specific columns/fields from a dataframe (anything that starts with _), whether the field is in the root or in a struct, using the dropFields method.
For ...
1 vote · 0 answers · 82 views
Lucene Boolean Operator Problems
Using Lucene, certain queries parse and execute in a completely unexpected way.
Here's the code for testing it (written in Scala, but can be easily translated to Java too):
import org.apache.lucene....
0 votes · 1 answer · 62 views
How to reference a CSV column with parentheses and a decimal point in Spark SQL or COALESCE expression?
I’m working on a data ingestion pipeline using Apache Spark (triggered via a Cloud Function on Dataproc).
The input CSV contains column names that include special characters such as parentheses and a ...
1 vote · 0 answers · 63 views
How to run sbt offline (downloading all the dependencies and moving them)
I wanted to use the riscv-torture project to create tests on a server that has no access to the internet. I first ran it on my Ubuntu 22.04 machine, then copied the .sbt, .ivy2, .cache/coursier and .cache/JPN folders to the ...
2 votes · 1 answer · 74 views
Determine the variance of a class/type parameter by reflection without using `TypeTag`
I can get the formal type parameters (type variables) of any Java class/interface, including those defined in Scala, through the java.lang.reflect API.
In addition, Scala allows defining type parameters ...
-1 votes · 2 answers · 131 views
When to use conditionals vs pattern matching, is there a performance trade off? [closed]
I have two possible implementations of a simple list drop method.
A:
def drop[A](l: List[A], n: Int): List[A] = (n, l) match
case (n, l) if n <= 0 => l
case (_, Cons(_, t)) => drop(t, n - ...
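On the performance question above, the two styles usually compile to very similar branching code. A minimal sketch of variant A, using the standard library List instead of the book's custom Cons type (an assumption on my part), looks like this:

```scala
import scala.annotation.tailrec

// Tail-recursive drop; a simple match on a sealed type like List
// desugars to the same kind of branch checks an if/else chain would.
@tailrec
def drop[A](l: List[A], n: Int): List[A] =
  if (n <= 0) l
  else l match {
    case Nil    => Nil
    case _ :: t => drop(t, n - 1)
  }
```

For matches this simple, any difference from an if/else version is typically negligible; readability is the better deciding factor.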
1 vote · 0 answers · 77 views
scala-maven-plugin cannot compile the Java 15+ text block feature
I have a mixed Scala and Java compilation project and am now upgrading from JDK 8 to JDK 21, using the Java 15+ text block feature.
before:
String sql = "update hinsight.homeco_user_task set \n" +
"...
0 votes · 0 answers · 78 views
JSON matcher stops matching after migration to specs2 4.22
I have a small specification that uses the JSON matchers:
import org.specs2.matcher.JsonMatchers
import org.specs2.mutable.Specification
class Test extends Specification with JsonMatchers {
"test&...
1 vote · 0 answers · 40 views
Get list of settings from another setting in SBT
I'm new to Scala and I'm having trouble wrapping my head around the following scenario - I want to have a centralized spot in build.sbt where we log the current profile and set certain values based on ...
1 vote · 3 answers · 113 views
How to transform a Right to a Left
I was just reading some Scala code that, due to the structure of the program (and its use of for comprehensions), conditionally transformed a Right into a Left using flatMap:
// someMethod() returns ...
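The pattern being asked about can be sketched with hypothetical names (the original code is truncated, so this is an illustration rather than the asker's actual method):

```scala
// Conditionally turning a Right into a Left via flatMap:
// a Right flows through, a failing check produces a Left,
// which short-circuits any later for-comprehension steps.
def ensurePositive(e: Either[String, Int]): Either[String, Int] =
  e.flatMap(v => if (v > 0) Right(v) else Left(s"non-positive: $v"))
```

The standard library also offers `filterOrElse` for this shape: `e.filterOrElse(_ > 0, "non-positive")` keeps the Right only when the predicate holds.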
1 vote · 0 answers · 71 views
In a Gradle project that contains both Kotlin and Scala code, what's the easiest way to make Kotlin compilation depend on Scala bytecode?
In Gradle, both the Kotlin and Scala compiler plugins have limitations: they can both compile Java code in mixed order, but they cannot compile each other. The only option is to:
compile Scala code first
...
1 vote · 0 answers · 73 views
Why is the mapAsync internal buffer not getting garbage collected?
I've written the following snippet to reproduce my problem:
implicit val actorSystem: ActorSystem = ActorSystem()
implicit val executionContext: ExecutionContext = actorSystem.dispatcher
...
0 votes · 0 answers · 59 views
Scala Spark: Why does DataFrame.transform calling a transform hang?
I have a Scala (v2.12.15) Spark (v3.5.1) job that works correctly and looks something like this:
import org.apache.spark.sql.DataFrame
...
val myDataFrame = myReadDataFunction(...)
....
1 vote · 0 answers · 72 views
Quill-Cassandra: Could not reach any contact point
I'm trying to set up a connection to Cassandra using Quill, but I get an error:
08:11:14.891 [main] ERROR trails.TrailsModule - Failed to create Cassandra context
com.datastax.oss.driver.api.core....
2 votes · 3 answers · 69 views
Is there any way to shorthand multiple parameterised types
Say I have the following trait and classes:
trait T[C1, C2, C3, C4]
case class A() extends T[Int, Double, Long, String]
case class B() extends T[Int, Double, Long, String]
case class C() extends T[...
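Assuming the repeated type arguments really are identical across the case classes, the usual shorthand is a type alias, which classes can extend directly. A minimal sketch (alias name `Std` is hypothetical):

```scala
trait T[C1, C2, C3, C4]

// Name the repeated instantiation once; extending the alias
// is the same as extending T[Int, Double, Long, String].
type Std = T[Int, Double, Long, String]

case class A() extends Std
case class B() extends Std
```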
1 vote · 3 answers · 96 views
How to pass an array of structs as a parameter to a UDF in Spark 4
Does anybody know what I am doing wrong? The following is a reduced code snippet that works in Spark 3.x but doesn't work in Spark 4.x.
In my use case I need to pass a complex data structure to a UDF (let's say ...
0 votes · 0 answers · 53 views
Scala Akka actors - Message to Actor was not delivered
I have an application built using the Akka actor framework which reads messages from a Kafka topic and processes them. My code is structured in the following way:
I have a KafkaConsumerActor that reads ...
1 vote · 2 answers · 68 views
How to use ScalaCheck to produce arbitrary values with a Numeric context bound
Using ScalaCheck's Arbitrary API and generators, how can I produce arbitrary numbers that have a context bound of Numeric[A], with A being an AnyVal? It doesn't matter if the result is an Arbitrary or ...
3 votes · 0 answers · 54 views
Why does `case a & Nothing` make this recursive Scala 3 match type compile?
In Scala 3 (3.3.x) I want a recursive match type over intersections:
sealed trait Authorization
trait Admin extends Authorization
trait Owner extends Authorization
trait Authorizer[A <: ...
1 vote · 1 answer · 132 views
With Hibernate 7, how do I map a Java Duration to a Postgres interval in a typed query?
I'm migrating from Hibernate 5 to 7. I use the lib io.hypersistence::hypersistence-utils-hibernate-70:3.10.1 (previously com.vladmihalcea::hibernate-types-52:2.19.1).
I mapped java.time.Duration to ...
0 votes · 1 answer · 84 views
Java Vert.x WebClient: stream from server with chunked messages
I have a Vert.x server which sends information to all streaming clients on events.
Using a console client, everything works as desired.
But if I use a Vert.x WebClient, the client connects but does not receive anything.
...
0 votes · 0 answers · 41 views
How to enable kerning/ligatures (LayoutProcessor) with TTF loaded from Cloud Storage?
Library: OpenPDF 1.4.2 (com.lowagie.text.pdf)
Goal: Enable kerning and ligatures for TrueType/OTF fonts using LayoutProcessor (enableKernLiga) when rendering text via ColumnText/Paragraph.
Constraint: ...
0 votes · 0 answers · 62 views
How to suppress deprecation warnings in code generated for type classes?
In Scala 3 macros, how do I suppress deprecation warnings in code generated for type classes of types defined like this:
@deprecated("some reason") case class C(x: Int) derives SomeTypeClass
...
1 vote · 0 answers · 47 views
How do I peek a Bundle in ChiselSim?
I'm trying to test a circuit with ChiselSim. The input to the circuit is a Bundle (without nesting) which I can poke without any problem, but when I try to peek the output of the same type I get an ...
0 votes · 1 answer · 93 views
Combining a sequence of Either function results using a for comprehension [duplicate]
Using a for comprehension, I can succinctly make a series of interdependent calls to one or more functions that return an Either, short circuiting when any returns a Left.
I can use this approach to ...
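The two shapes involved can be sketched with a hypothetical `parse` function: a for comprehension for a fixed number of interdependent calls, and a hand-rolled "sequence" for a whole list of results (both stop at the first Left):

```scala
def parse(s: String): Either[String, Int] =
  s.toIntOption.toRight(s"not a number: $s")

// Fixed, interdependent steps: the for comprehension
// short-circuits on the first Left.
val sum: Either[String, Int] = for {
  a <- parse("1")
  b <- parse("2")
} yield a + b

// A whole sequence of results, combined into a single Either,
// keeping the first Left encountered.
def sequence[E, A](xs: List[Either[E, A]]): Either[E, List[A]] =
  xs.foldRight(Right(Nil): Either[E, List[A]]) { (e, acc) =>
    for { h <- e; t <- acc } yield h :: t
  }
```

Libraries like cats provide this as `traverse`/`sequence`; the fold above is the dependency-free equivalent.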
0 votes · 0 answers · 138 views
Decode a sealed trait with type as String
sealed trait UserEvent extends Event {
def entityId: String
def instant: ZonedDateTime
}
case class Created(username: String,
password: String,
email: String,...
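No JSON library is visible in the truncated excerpt, so here is only the core dispatch idea with simplified, hypothetical fields: decode the `type` discriminator as a String and match on it. A real codec (circe, play-json, ...) would do the same match inside its Decoder after reading the `type` field.

```scala
sealed trait UserEvent { def entityId: String }
final case class Created(entityId: String, username: String) extends UserEvent
final case class Deleted(entityId: String) extends UserEvent

// Dispatch on the discriminator; unknown types become a Left
// instead of a runtime crash.
def decode(tpe: String, entityId: String, username: String): Either[String, UserEvent] =
  tpe match {
    case "Created" => Right(Created(entityId, username))
    case "Deleted" => Right(Deleted(entityId))
    case other     => Left(s"unknown event type: $other")
  }
```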
0 votes · 1 answer · 96 views
Case-insensitive decoder of Enum
I have the below case class Person, which has an enum.
object Gender extends Enumeration {
type Gender = Value
val Male, Female, Unknown = Value
implicit val genderDecoder: Decoder[Gender.Value] ...
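The case-insensitive part can be isolated from the JSON layer: scan the enumeration's values with `equalsIgnoreCase`. A sketch using only the standard library (a circe decoder could then delegate to this via `Decoder.decodeString.emap`):

```scala
object Gender extends Enumeration {
  type Gender = Value
  val Male, Female, Unknown = Value

  // Case-insensitive lookup over the enum's values.
  def fromStringInsensitive(s: String): Either[String, Value] =
    values.find(_.toString.equalsIgnoreCase(s))
      .toRight(s"unknown gender: $s")
}
```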
1 vote · 1 answer · 94 views
How do you pass any type of object from the CtxOut in a HandlerAspect down to a Handler?
This question is adjacent to that one, but not completely, as my zio-http implementation is a bit more involved, which makes this quite a bit trickier.
The question is the same: how do I pass any ...
0 votes · 1 answer · 68 views
Apache Spark Error: Cannot cast STRING into a StructType(StructField(subFieldA,StringType,true)) (value: BsonString{value='{}'})
I’m reading documents from DocDB (MongoDB) into Spark using the mongo-spark-connector.
One of the fields, fieldA, is a nested object. If fieldA is missing in a document, I replace it with an empty ...
0 votes · 2 answers · 105 views
Scala Spark: decompress lz4 csv
Is there any way to decompress lz4-compressed CSV files with Spark?
I tried the following approaches:
First:
sparkSession.read
.option("delimiter", ",")
.option("compression", "...
0 votes · 2 answers · 96 views
Read files from storage account in Spark - Without using keys - Azure
I am doing local development and wish to run a Spark job locally on my desktop, accessing files in a storage account from the job.
I don't have the option to use SAS tokens or access keys for my storage ...
0 votes · 1 answer · 58 views
Bigtable Spark connector not reading cell timestamp with the data
I am using the Bigtable Spark connector to read Bigtable data in Scala code. I want to read the cell timestamp along with the data, but I can't find anywhere how to do it. Can someone help me with this?
I ...
0 votes · 1 answer · 52 views
Global Temp view shows up empty when passed from Pyspark to Scala notebook in Databricks
I have an Azure Databricks workflow which runs a PySpark notebook, which in turn calls a Scala notebook (legacy) for a list of tables. In the PySpark notebook, I save a DataFrame to a GlobalTempView and ...
0 votes · 1 answer · 88 views
How to fail Spark on wrong schema?
The data:
{"name": "name1", "id": 1}
{"name": "name2", "id": "1"}
The code:
val schema =
"""
| name ...
2 votes · 0 answers · 70 views
SyntacticRule with optional Semantic in Scalafix?
I am working on rules to switch from Kyo 0.19 to Kyo 1.0. Is there a way to make those rules syntactic, with some optional symbol checks where possible (i.e. when SemanticDB is available)?
package kyo.rules
...
0 votes · 0 answers · 107 views
Google Sign-In fails silently with /login?error when using HTTPS (works on HTTP) in Spring Boot + Scala application
I am facing an issue with the Google Sign-In integration in my Scala + Spring Boot web application.
Problem:
When I try to sign in with Google using HTTPS (deployed via ngrok), I am silently ...
0 votes · 2 answers · 87 views
sbt plugin resolvers not adding sbtVersion, scalaVersion to artifactory url
In an sbt project, I'm adding some plugins, like
addSbtPlugin("org.typelevel" % "sbt-tpolecat" % "0.5.2")
This should resolve to an artifact url similar to this:
https://...
0 votes · 1 answer · 51 views
Generic way of array creation in Scala.js
I'm looking for a generic way of array creation in Scala.js.
The following code works fine in JVM and Scala Native platforms:
def newArray[A](fqcn: String, size: Int): Array[A] = {
val clazz = Class....
1 vote · 1 answer · 72 views
Referring to non-existent class while compiling to ScalaJS
I'm building a Scala.js & Scala 3 application. When I run the npm run dev/build command I get the following error, and I can't figure out how to fix it:
Referring to non-existent class scala....
0 votes · 2 answers · 80 views
How to fetch conditionally from a nested map efficiently
I have a map which holds prices of commodities in different currencies:
val commpricemap: Map[String, Map[String, Double]] = ???
An example of an entry for gold is as below:
("AU" -> ...
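A sketch of the lookup, with hypothetical sample entries since the real map is elided above: composing the two `get` calls with `flatMap` avoids separate contains/apply checks and touches each map only once per lookup.

```scala
// Hypothetical sample data for illustration.
val commpricemap: Map[String, Map[String, Double]] =
  Map("AU" -> Map("USD" -> 2300.0, "EUR" -> 2100.0))

// None for an unknown commodity or an unknown currency,
// without any explicit conditionals.
def priceOf(commodity: String, ccy: String): Option[Double] =
  commpricemap.get(commodity).flatMap(_.get(ccy))
```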
0 votes · 0 answers · 34 views
Mill debugger in IntelliJ in the XiangShan project
I'm trying to debug the XiangShan RISC-V SoC project using IntelliJ IDEA with Mill and BSP.
I've followed all recommended steps (Scala, the Mill build tool, and the IntelliJ debugger) but breakpoints ...
0 votes · 2 answers · 105 views
Spark unit test fails under Maven but passes in IntelliJ
I'm working on a Scala project using Spark (with Hive support in some tests) and running unit and integration tests via both IntelliJ and Maven Surefire.
I have a shared test session setup like this:
...
0 votes · 0 answers · 55 views
Newest Java 24 can't run Play because of class file version mismatch [duplicate]
I've created a project that ran perfectly fine two weeks ago with Scala and the Play Framework. It worked then, but now it doesn't, and the error confuses me a lot. I used Java 24 and also tested on Java 17, but ...
0 votes · 1 answer · 84 views
Transform a match to avoid duplication
Is there a construct that allows dropping the repetition of the 0.0 in the routine below?
val price: Option[Double] = ???
price match {
case Some(d) =>
if (isPriceFair(d))
d
else
...
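The truncated match above appears to yield the price when it exists and is fair, and 0.0 otherwise. If so, `Option.filter` plus `getOrElse` collapses both failure paths into a single default (the predicate below is a hypothetical stand-in for the real `isPriceFair`):

```scala
def isPriceFair(d: Double): Boolean = d > 0.0 // hypothetical predicate

// None and "unfair price" both fall through to the single 0.0 default.
def effectivePrice(price: Option[Double]): Double =
  price.filter(isPriceFair).getOrElse(0.0)
```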
-1 votes · 1 answer · 99 views
How do I filter data based on text expressions? [closed]
I can read data from a database and map the data to a case class using an ORM. The case class is like the following:
case class Example(a: Double, b: Double)
And I have a text file and this file ...
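The text file's expression format is elided above, so as an illustration only, here is a tiny hand-rolled parser for hypothetical expressions of the form `<field> <op> <value>` (e.g. "a > 1.5") that produces a reusable predicate:

```scala
case class Example(a: Double, b: Double)

// Hypothetical expression grammar; only fields a/b and the
// operators >, <, == are handled in this sketch.
def predicate(expr: String): Example => Boolean = {
  val parts = expr.trim.split("\\s+")
  val (field, op, value) = (parts(0), parts(1), parts(2))
  val get: Example => Double = field match {
    case "a" => _.a
    case "b" => _.b
  }
  val v = value.toDouble
  op match {
    case ">"  => e => get(e) > v
    case "<"  => e => get(e) < v
    case "==" => e => get(e) == v
  }
}
```

The resulting function can be passed straight to `Seq#filter` over the mapped rows.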
1 vote · 1 answer · 113 views
Avoiding redundant else
case class Person(id: String, age: Option[Double])
val rows: Seq[Person] = List(Person("a", None), Person("c", None), Person("e", Some(50.0)))
val ages = rows.foreach(r => r.age ...
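Assuming the goal is collecting only the defined ages, `flatMap` over the Options does it without any else branch or foreach-with-accumulator (the data below mirrors the excerpt, with the last age wrapped in Some as the field type requires):

```scala
case class Person(id: String, age: Option[Double])

val rows: Seq[Person] =
  List(Person("a", None), Person("c", None), Person("e", Some(50.0)))

// flatMap flattens away the None cases; only defined ages remain.
val ages: Seq[Double] = rows.flatMap(_.age)
```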
0 votes · 1 answer · 73 views
Scala Scopt does not return a non-zero error code on failure
When an invalid number of arguments is passed while calling the application jar (spark-submit in this case), Scala Scopt does not return a non-zero error code. So although scopt failed, the scheduler shows ...
2 votes · 0 answers · 93 views
How to create a `lazy val` with a parameterized name and value in Scala 3 macros?
It is so easy in Scala 2, just:
q"lazy val $name = $value"
I need it for local values, so I don't have multi-threading requirements.
I've tried just adding Flags.Lazy to the val definition ...
0 votes · 0 answers · 65 views
Unable to load org.apache.spark.sql.delta classes from JVM pyspark
I’m working on Databricks with a cluster running Runtime 16.4, which includes Spark 3.5.2 and Scala 2.12.
For a specific need, I want to implement my own custom way of writing to Delta tables by ...