Questions tagged [scala]

48981 questions
1 vote · 1 answer · 303 views

Scala Class body or primary constructor body

It's a very basic question. Out of curiosity, I wanted to know: in the code below, class A(str: String) { //body here... }, is the body the Scala class body, the primary constructor body, or an instance initializer body (like in Java)?
boseAbhishek
1 vote · 1 answer · 3.1k views

Error illegal start of simple expression

I am getting this error: error: illegal start of simple expression def process_alcs(lines: List[String]) : List[(String, Double)] = for (line
Ruben Lejzerowicz
1 vote · 1 answer · 580 views

How can I exclude all transitive dependencies of a library in SBT?

Practically, I just want to port this Gradle code to SBT: compile('group:name:version') { transitive = false } How could I achieve that?
Nicofisi
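For the question above: sbt has a direct equivalent of Gradle's transitive = false, the intransitive() modifier on a module ID. A sketch with placeholder coordinates:

```scala
// build.sbt sketch -- group/name/version are placeholders
libraryDependencies += ("group" % "name" % "version").intransitive()
```

For excluding only specific transitive dependencies rather than all of them, excludeAll / exclude on the module ID are the finer-grained alternatives.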
1 vote · 1 answer · 518 views

How to install sbt ensime plugin globally?

I've created ~/.sbt/1.0/plugins/plugins.sbt. After that I cd'd into ~/.sbt/1.0/plugins and ran addSbtPlugin('org.ensime' % 'sbt-ensime' % '2.5.1'), but I got this error: zsh: unknown sort specifier. Any ideas?
Dimitar Nonov
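A likely cause of the zsh error above is that the addSbtPlugin line was typed at the shell prompt rather than written into the file, so zsh tried to interpret the parentheses itself; sbt also expects double-quoted strings. A sketch of the file contents:

```scala
// ~/.sbt/1.0/plugins/plugins.sbt -- an sbt setting, not a shell command
addSbtPlugin("org.ensime" % "sbt-ensime" % "2.5.1")
```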
1 vote · 1 answer · 684 views

Consuming RESTful API and converting to Dataframe in Apache Spark

I am trying to convert the output of a URL from a RESTful API directly into a DataFrame in the following way: package trials import org.apache.spark.sql.SparkSession import org.json4s.jackson.JsonMethods.parse import scala.io.Source.fromURL object DEF { implicit val formats = org.json4s.DefaultFormats ca...
Utkarsh Saraf
1 vote · 3 answers · 4k views

How to create an empty DataFrame in Spark

I have a set of Avro-based Hive tables and I need to read data from them. As Spark SQL uses Hive serdes to read the data from HDFS, it is much slower than reading HDFS directly. So I have used the Databricks Spark-Avro jar to read the Avro files from the underlying HDFS dir. Everything works fine except w...
Vinay Kumar
-1 votes · 0 answers · 11 views

What technologies do I need to build an AI-based camera app for desktop?

(Note: if the question belongs to a different Stack Exchange site, notify me or mark it for migration. Don't downvote.) Our team decided to develop an application with the features below. The aim of the application is to dress a person, who is standing before a display with a camer...
Lord Commander
0 votes · 0 answers · 4 views

TapeEquilibrium ScalaCheck

I have been trying to write a ScalaCheck property to verify the Codility TapeEquilibrium problem. For those who do not know the problem, see the following link: https://app.codility.com/programmers/lessons/3-time_complexity/tape_equilibrium/. I wrote the following, still incomplete, code. test('Lesso...
Alessandroempire
0 votes · 0 answers · 4 views

Spark-Scala Log Rotation Issue, Unable to create External Log

Unable to create log rotation using RollingFileAppender & ConsoleAppender: I received the WARN below during execution and no log file is created in the specified location; the default log file is still used. ++++++++++++++++++++++++++++++++...
Srini K
1 vote · 3 answers · 98 views

Deeplearning4j Sharing Computational Graph between Threads in Scala

I'm trying to do image classification. I'm using Scala, the Akka actor system, and deeplearning4j. The thing is that I always have to detect the same spots or crops on the image. I was thinking of creating a new actor for each crop of the image, on each frame. The thing is that, from what I unders...
Tomas Piaggio
1 vote · 2 answers · 287 views

SCALA: Function for Square root of BigInt

I searched the internet for a function to find the exact square root of a BigInt using the Scala programming language. I didn't find one, but I saw a Java program and converted that function into a Scala version. It works, but I am not sure whether it can handle very large BigInts. It returns BigInt only...
RAGHHURAAMM
1 vote · 1 answer · 1.8k views

Parse JSON Array in Scala

I have this jsArray (JSON array) and I am using the import play.api.libs.json._ library. [{"device":"Samsung S8","android":true}, {"device":"iPhone 8","android":false}, {"device":"MacBook Air Pro","android":false}, {"device":"Dell XPS","android":false}] I...
1 vote · 2 answers · 698 views

Spark collect_list and limit resulting list

I have a dataframe of the following format: name merged key1 (internalKey1, value1) key1 (internalKey2, value2) ... key2 (internalKey3, value3) ... What I want to do is group the dataframe by name, collect the list, and limit the size of the list. This is how I group by name...
pirox22
0 votes · 0 answers · 4 views

Is there update operator support on MongoDB connector for Spark?

Using: Spark / MongoDB connector for Spark / Scala. Hi, are update operators supported on the MongoDB connector for Spark? I'd like to use an update document query (updateOne) with the $inc, $currentDate, $setOnInsert update operators and update options (like upsert = true). I already did it with mongo-scala...
gingermanjh
-1 votes · 3 answers · 23 views

Spark - Map flat dataframe to a configurable nested json schema

I have a flat dataframe with 5-6 columns. I want to nest them and convert it into a nested dataframe so that I can then write it to parquet format. However, I don't want to use case classes as I am trying to keep the code as configurable as possible. I'm stuck with this part and need some help. My...
devnong
1 vote · 1 answer · 51 views

assert in foreach causes too few argument lists for macro invocation

I'm using ScalaTest to check if an Array contains Arrays of a given size: result.map(_.length == 2).foreach(assert) This causes a compilation error: Error:(34, 39) too few argument lists for macro invocation result.map(_.length == 2).foreach(assert) although IntelliJ does not indicate a compilation error. H...
Andronicus
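A minimal sketch of the usual fix: ScalaTest's assert is a macro, so the bare method name cannot be eta-expanded; passing the argument explicitly compiles in both plain Scala and ScalaTest (here with Predef.assert to stay self-contained):

```scala
val result = Array(Array(1, 2), Array(3, 4))

// explicit lambda instead of the bare macro name
result.map(_.length == 2).foreach(a => assert(a))
```

The equivalent `.foreach(assert(_))` also works, since it too expands to an explicit application.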
1 vote · 2 answers · 64 views

How can I get the path to the current target directory in my build.sbt

In my build.sbt I want to know the current target file. Something like this: val targetFile = ??? // /home/fbaierl/Repos/kcc/scala/com.github.fbaierl/target/scala-2.12/myapplication_2.12-1.2.3-SNAPSHOT.jar With target.value I only get the directory up to /target. Is there any way to get the full...
Florian Baierl
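If the goal is the full path of the packaged jar (as the example path suggests), artifactPath scoped to packageBin is the usual setting; a sketch with a hypothetical printJarPath task:

```scala
// build.sbt sketch
lazy val printJarPath = taskKey[Unit]("Prints the path of the jar built by packageBin")
printJarPath := {
  val jar = (Compile / packageBin / artifactPath).value
  println(jar.getAbsolutePath)
}
```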
1 vote · 1 answer · 20 views

How to find scala documentation api from interpreter

In Python we can do help(function) or help(class) to find the documentation of a particular function or class. How can we do that in Scala, from the Scala interpreter?
mctrjalloh
1 vote · 3 answers · 67 views

Scala function composition totalFn(partialFn(totalFn(x)))

I was trying to compose three functions with only the middle one being a PartialFunction. I would expect the resulting type to be PartialFunction as well. Example: val mod10: Int => Int = _ % 10 val inverse: PartialFunction[Int, Double] = { case n if n != 0 => 1.0 / n } val triple: Double => Double...
norbertk
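A sketch of one way to keep partiality: andThen/compose on a PartialFunction return a plain Function1, but the composition can be rebuilt as a PartialFunction via lift and Function.unlift:

```scala
val mod10: Int => Int = _ % 10
val inverse: PartialFunction[Int, Double] = { case n if n != 0 => 1.0 / n }
val triple: Double => Double = _ * 3

// lift turns the partial step into Int => Option[Double];
// unlift turns the whole Option pipeline back into a PartialFunction
val composed: PartialFunction[Int, Double] =
  Function.unlift((x: Int) => inverse.lift(mod10(x)).map(triple))
```

For example, composed.isDefinedAt(20) is false, because mod10(20) == 0 falls outside inverse's domain.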
1 vote · 1 answer · 29 views

Cannot run a scala script (fpmax by jdegoes)

Here's a brilliant talk by J. A. De Goes: https://www.youtube.com/watch?v=sxudIMiOo68 - highly recommended for everyone interested in functional programming. And here's the accompanying code gist: https://gist.github.com/jdegoes/1b43f43e2d1e845201de853815ab3cb9 When I run $ scalac fpmax.scala, it com...
Vasily802
1 vote · 3 answers · 74 views

Executing N times a Scala Future

I am trying to find a more elegant way to execute a function which returns a Future[HttpResponse] two times, and then use the response of the second call. for { // function post returns a Future[HttpResponse] response
Martin
1 vote · 1 answer · 27 views

How to properly apply HashPartitioner before a join in Spark?

To reduce shuffling during the joining of two RDDs, I decided to partition them using HashPartitioner first. Here is how I do it. Am I doing it correctly, or is there a better way to do this? val rddA = ... val rddB = ... val numOfPartitions = rddA.getNumPartitions val rddApartitioned = rddA.partiti...
MetallicPriest
1 vote · 2 answers · 57 views

how to match Some with any value in mockito

My function under test returns None or Some(ObjectOfSignupEmail). In my test case, I want to match that the returned value is Some(ArgumentMatchers.any[SignupEmail]), but I get the error: Expected: Some(null), Actual: Some(SignupEmail(Welcome,Test,Click here to verify email)). If I change the code to sig...
Manu Chadha
1 vote · 1 answer · 48 views

typeclass annotation should have been removed by simulacrum but was not

I am trying to write a simple typeclass using Simulacrum. Here is my build.sbt: ThisBuild / scalaVersion := "2.12.8" ThisBuild / version := "0.1.0-SNAPSHOT" ThisBuild / organization := "com.example" ThisBuild / organizationName := "example" lazy val root = (project in file(".")) .set...
Knows Not Much
1 vote · 5 answers · 57 views

how to extract part of string that did not match pattern

I want to extract the part of a string that did not match a pattern. My pattern-matching condition is: the string should be of length 5 and should contain only N or Y. Ex: NYYYY => valid; NY => invalid, length is invalid; NYYSY => invalid, character at position 3 is invalid. If the string is invalid then I want to f...
Niketa
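A sketch of the character-validation part, assuming 0-based positions are acceptable: collect every character that is neither N nor Y together with its index:

```scala
// returns the offending (char, index) pairs; empty means all chars are valid
def invalidChars(s: String): List[(Char, Int)] =
  s.zipWithIndex.collect { case (c, i) if c != 'N' && c != 'Y' => (c, i) }.toList

val bad = invalidChars("NYYSY") // the 'S' at index 3
```

The length-5 requirement can be reported separately with a plain `s.length == 5` check.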
1 vote · 1 answer · 47 views

Design a generic trait in Scala

I'm learning data structures recently. There is a case where I want to design a generic trait whose type should support comparison. If I needed to design a generic class, I could design it like the following: class SortedType[A: Ordering](val x: A) val x = new SortedType(3) val y = new SortedType("Hello, Wo...
xiaojia zhang
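Since traits cannot take constructor parameters or context bounds, one common sketch is to expose the Ordering as an implicit abstract member that implementors must supply (the names here are hypothetical):

```scala
trait SortedContainer[A] {
  implicit def ord: Ordering[A] // supplied by the concrete implementation
  def values: List[A]
  def sortedValues: List[A] = values.sorted // resolves ord implicitly
}

// a minimal concrete instance for Int
val ints = new SortedContainer[Int] {
  val ord = Ordering.Int
  val values = List(3, 1, 2)
}
```

The alternative is to keep the trait unconstrained and demand the Ordering only at the methods that need it, e.g. `def sortedValues(implicit ord: Ordering[A]): List[A]`.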
1 vote · 2 answers · 59 views

Sum columns of a Spark dataframe and create another dataframe

I have a dataframe like below. I am trying to create another dataframe from it which has 2 columns: the column name and the sum of values in each column, like this. So far, I've tried this (in Spark 2.2.0) but it throws a stack trace: val get_count: (String => Long) = (c: String) => { df.groupB...
van_d39
1 vote · 2 answers · 43 views

case when otherwise dataframe spark

I wrote this: val result = df.withColumn("Ind", when($"color" === "Green", 1).otherwise(0)) And I want to extend the condition $"color" === "Green" to $"color" in ["GREEN", "RED", "YELLOW"]. Any idea how to do this, please?
scalacode
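The usual spelling of a membership test on a Spark column is isin; a sketch assuming df has a color column and the SparkSession implicits are in scope:

```scala
import org.apache.spark.sql.functions.when

// $"color".isin(...) replaces a chain of === comparisons joined with ||
val result = df.withColumn(
  "Ind",
  when($"color".isin("GREEN", "RED", "YELLOW"), 1).otherwise(0)
)
```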
1 vote · 1 answer · 58 views

How spark loads the data into memory

I am totally confused about the Spark execution process. I have referred to many articles and tutorials, but none discuss it in detail. I might be misunderstanding Spark; please correct me. I have a file of 40GB distributed across 4 nodes (10GB each node) of a 10-node cluster. When I say spark...
Learner
1 vote · 1 answer · 173 views

How to add implicit method to string

How can I write an implicit method on String to log directly? The code below works fine: case class Worker(name: String) extends Actor with ActorLogging { override def receive: Receive = { case string: String => log.info("string received") case _ => log.info("unknown message received") } } Want...
Chandan Kumar
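A sketch of the enrichment pattern the question is after: an implicit class adds a method to String. Inside an actor the body would delegate to ActorLogging's log.info; println keeps this example self-contained:

```scala
// hypothetical names; in an actor, replace println with log.info
implicit class StringLogOps(s: String) {
  def logInfo(): String = { println(s"INFO: $s"); s }
}

val echoed = "string received".logInfo()
```

Returning the original string makes the method chainable inside a receive block.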
1 vote · 1 answer · 49 views

Stackable trait/decorator and abstract class

I have an abstract class (Java library) that takes constructor arguments and has a method called execute that I want to decorate: public abstract class Task { private final String name; protected Task(String name) { this.name = name; } public abstract void execute(String str) throws Exception } And...
Alexey Sirenko
1 vote · 1 answer · 40 views

Spark SQL lazy count

I need to use a dataframe count as the divisor for calculating percentages. This is what I'm doing: scala> val df = Seq(1,1,1,2,2,3).toDF("value") scala> val overallCount = df.count scala> df.groupBy("value") .agg( count(lit(1)) / overallCount ) But I would like to avoid the action df.count as it will...
Pedro H
1 vote · 2 answers · 36 views

What are these symbols that crash URLDecoder with UTF-8?

I'm using URLDecoder to decode a string: import java.net.URLDecoder; URLDecoder.decode("%u6EDA%u52A8%u8F74%u627F", StandardCharsets.UTF_8.name()); which leads to the crash: Exception in thread "main" java.lang.IllegalArgumentException: URLDecoder: Illegal hex characters in escape (%) pattern - For in...
Sahand
1 vote · 3 answers · 68 views

How to override equals for alias types in scala

I am using a type alias like below: trait IPAddress object IPAddr { type IP = Array[Byte] with IPAddress type IPv4
Sayantan Ghosh
1 vote · 1 answer · 60 views

Unable to understand Scala's type inference

I was going through Learning Concurrency with Scala. It had the following piece of code: package week_parallel.week1.SC_Book import scala.collection.mutable object SyncPoolArgs extends App { private val tasks = mutable.Queue[() => Unit]() object Worker extends Thread { setDaemon(true) def poll() = task...
Ishan Bhatt
1 vote · 2 answers · 56 views

Scala: filtering list of tuples to get a nonEmptyList

I have a list of the type List[(A, List[B])]. I want to flatten this structure and get: a NonEmptyList[A] consisting of all the As corresponding to a non-empty List[B], and all those Bs combined into a NonEmptyList[B]. i.e., I want to get Option[(NonEmptyList[A], NonEmptyList[B])]. What is the most concise way of doi...
Maths noob
1 vote · 1 answer · 26 views

Cannot create object walking through JSON string using Circe JSON parser

So I am trying to create a custom decoder to convert a JSON string into a domain object. I am using Scala/Circe to walk through the JSON and create the object. I am unable to get this to run. I am sure I am not clear on how to walk through the JSON. Can someone advise, please? This is the...
Som Bhattacharyya
1 vote · 1 answer · 32 views

Scala Spark: How to pad a sublist inside a dataframe with extra values?

Say I have a dataframe, originalDF, which looks like this +--------+--------------+ |data_id |data_list | +--------+--------------+ | 3| [a, b, d] | | 2|[c, a, b, e] | | 1| [g] | +--------+--------------+ And I have another dataframe, extraInfoDF, which looks like...
JR3652
1 vote · 1 answer · 32 views

Optimization cats library "<*>"

I am trying to optimize the imports in my code and eliminate all wildcards such as import cats.implicits._. I have already optimized all my imports except for the cats symbol <*>. I tried with: import cats.Apply.Ops import cats.Apply.{A,lot,of,options} import cats.implicits.{A,lot,of,options} import cats.syntax.ApplyOps...
1 vote · 3 answers · 50 views

Can I avoid mutating in this scenario?

I am trying to iterate a list, doing a range check on each item and accumulating scores accordingly. Pretty straightforward. I feel like I am doing this in a more traditional way, creating a lot of 'var' variables. Is there an effective functional/immutable way of achieving this behavior? var
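The accumulate-over-a-list pattern described above is exactly what foldLeft provides; a sketch with hypothetical data, range boundaries, and scores:

```scala
val items = List(5, 42, 77, 13) // sample data

// assumed ranges for illustration: <10 scores 1, 10-49 scores 2, otherwise 3
def scoreOf(n: Int): Int =
  if (n < 10) 1
  else if (n < 50) 2
  else 3

// no vars: the accumulator is threaded through the fold
val total = items.foldLeft(0)((acc, n) => acc + scoreOf(n))
```

Equivalently, `items.map(scoreOf).sum` works when the accumulation is a plain sum; foldLeft generalizes to accumulators that carry more state.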
