Questions tagged [scala]

39,100 questions

1 vote · 1 answer · 919 views

Classpath issue when binding an instance in a Scala Interpreter

I already posted this question on the scala-lang forum but unfortunately did not get any answer. Second chance? I am trying to embed an interpreter and evaluate a Scala snippet in it. I would like to bind an instance of a custom class within the interpreter. To sum up, that would look like: i...
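
A minimal sketch of binding an instance, assuming Scala 2.x's IMain and a hypothetical com.example.MyService; setting usejavacp makes the interpreter reuse the host JVM's classpath, which is the usual fix for classpath issues in an embedded interpreter:

    import scala.tools.nsc.Settings
    import scala.tools.nsc.interpreter.IMain

    val settings = new Settings
    settings.usejavacp.value = true      // reuse the embedding JVM's classpath
    val interpreter = new IMain(settings)
    // bind the instance under a name, giving its fully qualified type
    interpreter.bind("myService", "com.example.MyService", new com.example.MyService)
    interpreter.interpret("""println(myService)""")
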
pascal

1 vote · 1 answer · 5.2k views

sbt stage: not a valid command

I am trying to follow the tutorial for Scala and Spray with this great template: http://typesafe.com/activator/template/activator-akka-spray Then I follow the instructions for Heroku deployment: https://devcenter.heroku.com/articles/getting-started-with-scala I encounter a problem when running the comm...
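
A minimal sketch of the usual fix: the stage task is provided by a packaging plugin, not by sbt itself, so it has to be added to the build (the plugin version here is an assumption for that era):

    // project/plugins.sbt
    addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0")

    // build.sbt
    enablePlugins(JavaAppPackaging)   // provides the `stage` task Heroku runs
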
lightmania

0 votes · 0 answers · 6 views

How to load a CSV file in Apache Spark (Scala)

I have two CSV files, users and tweets. For the tweets table I split each record on ','; but the tweet field contains many commas between its two quotes, so I'm not getting the proper output. What is the correct code in Scala?
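
A minimal sketch, assuming Spark 2.x: the built-in CSV reader respects quoting, so commas inside a quoted tweet field are not treated as separators the way a hand-rolled split(",") treats them:

    val tweets = spark.read
      .option("header", "true")
      .option("quote", "\"")
      .option("escape", "\"")
      .csv("/path/to/tweets.csv")   // path is a placeholder
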
JD Patil

1 vote · 1 answer · 469 views

Getting exp to work in spire

I'm learning Scala for the purpose of scientific programming. I'm trying to write some simple generic code using spire. I've got the following example working: import spire.algebra._ import spire.math._ import spire.implicits._ object TestSqrt { def testSqrt[T: Numeric](x: T) = { sqrt(x) } def mai...
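
A minimal sketch of the analogous exp version: in spire, exp is defined by the Trig type class rather than Numeric, so the context bound has to change:

    import spire.algebra.Trig
    import spire.math._
    import spire.implicits._

    object TestExp {
      def testExp[T: Trig](x: T): T = exp(x)   // exp needs a Trig[T] instance
      def main(args: Array[String]): Unit =
        println(testExp(1.0))                  // ~2.718
    }
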
jcrudy

1 vote · 1 answer · 950 views

Play framework only Error: object httpclient is not a member of package org.apache.commons

I have two imports in Scala on the Play framework. However, Play keeps telling me "object httpclient is not a member of package org.apache.commons". I did add commons-httpclient.jar as an external jar in .classpath (in Eclipse). I created a console application in Eclipse to verify, and there is no error at all...
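
A minimal sketch of the sbt-side fix: Play builds resolve dependencies through build.sbt, not through Eclipse's .classpath, so the library must be declared there (these are the library's actual Maven coordinates):

    // build.sbt
    libraryDependencies += "commons-httpclient" % "commons-httpclient" % "3.1"
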
user2512057

1 vote · 1 answer · 494 views

How to convert a Map into repeated tuples in Scala

I have a method that accepts repeated tuples: myfn(attributes: (String, Any)*). Can I convert a map to repeated tuples? For example: val m = Map("a1" -> 1, "a2" -> 8) // convert to tuples myfn(m)
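
A minimal sketch: a Map is an iterable of pairs, so converting it to a Seq and ascribing : _* expands it into the repeated-tuple parameter:

    def myfn(attributes: (String, Any)*): Unit = attributes.foreach(println)

    val m = Map("a1" -> 1, "a2" -> 8)
    myfn(m.toSeq: _*)   // expands to myfn("a1" -> 1, "a2" -> 8)
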
CruncherBigData

0 votes · 1 answer · 16 views

Scala Nested expression does not take parameters

Team, new to Scala and learning step by step. While learning nested scopes in expression blocks, I wrote the lines of code below: object ExpressionTest extends App { val area = { val PI = 3.14 PI * 10 { val PI = 100 PI * 2 } } println(area) } I am getting the exception below at runtime. Error:(9, 5) Int(10) does n...
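
A minimal sketch of the fix: without a separator, PI * 10 { ... } parses as the block being passed as an argument to the literal 10, hence "Int(10) does not take parameters"; splitting the two expressions removes the ambiguity:

    object ExpressionTest extends App {
      val area = {
        val PI = 3.14
        PI * 10          // this expression now ends here
      }
      val doubled = {
        val PI = 100     // a new, inner-scope PI
        PI * 2
      }
      println(area)      // ~31.4
    }
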
Learn Hadoop

-1 votes · 1 answer · 20 views

How to change value of immutable object

val factor = 3 val multiplier = (i: Int) => i * factor If I call multiplier(3), it gives the output 9. What I want is: multiplier(3) output = 9, then multiplier(3) output = 6. How can multiplier be changed for this kind of behavior, given that factor is immutable?
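
A minimal sketch: a val cannot be rebound, so either let the closure capture a var, or make factor an explicit parameter and stay immutable:

    var factor = 3
    val multiplier = (i: Int) => i * factor
    multiplier(3)   // 9
    factor = 2
    multiplier(3)   // 6

    // or, keeping everything immutable:
    def multiplierWith(factor: Int)(i: Int): Int = i * factor
    multiplierWith(3)(3)   // 9
    multiplierWith(2)(3)   // 6
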
Shubham Goyal

0 votes · 0 answers · 5 views

Spark: select columns by type

I want a function to dynamically select Spark DataFrame columns by their datatype. So far, I have created: object StructTypeHelpers { def selectColumnsByType[T
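
A minimal sketch of one way to finish this: filter the schema's fields by their DataType and select the matching columns:

    import org.apache.spark.sql.DataFrame
    import org.apache.spark.sql.functions.col
    import org.apache.spark.sql.types.{DataType, DoubleType}

    def selectColumnsByType(df: DataFrame, dt: DataType): DataFrame = {
      val cols = df.schema.fields.filter(_.dataType == dt).map(f => col(f.name))
      df.select(cols: _*)
    }

    // selectColumnsByType(df, DoubleType) keeps only the double columns
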
Georg Heiler

1 vote · 2 answers · 108 views

Difficulty understanding this type signature

The merge sort type signature: def msort[T](less: (T, T) => Boolean)(xs: List[T]): List[T] = { The function is called using: msort[Int]((a, b) => a < b) _ Does msort[Int] type the parameters a & b as Int? To better understand this type signature I've tried to extract the less function: d...
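
A minimal sketch of what the currying does: msort[Int] fixes T = Int, which is what types a and b as Int; the trailing underscore leaves the second parameter list unapplied, producing a reusable List[Int] => List[Int] function (body stubbed here for illustration):

    def msort[T](less: (T, T) => Boolean)(xs: List[T]): List[T] =
      xs.sortWith(less)                       // stand-in for the real merge sort body

    val ascending: List[Int] => List[Int] = msort[Int]((a, b) => a < b) _
    ascending(List(3, 1, 2))                  // List(1, 2, 3)
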
blue-sky

1 vote · 2 answers · 850 views

Finding the nth element of a list in scala recursively with pattern matching

I have a problem here from the 99 Scala problems (http://aperiodic.net/phil/scala/s-99/p03.scala) where I am trying to figure out how the given solution works. I'm fairly new to Scala. I was able to complete this challenge with a similar solution using pattern matching and recursion; however, mine did not consider the lis...
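
A minimal sketch in the P03 style, with the empty-list case the question alludes to handled explicitly:

    def nth[A](n: Int, xs: List[A]): A = (n, xs) match {
      case (0, h :: _)             => h
      case (k, _ :: tail) if k > 0 => nth(k - 1, tail)
      case _                       => throw new NoSuchElementException("list too short")
    }

    nth(2, List(1, 1, 2, 3, 5, 8))   // 2
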
user3789100

1 vote · 1 answer · 56 views

Why is this invalid Scala?

I'm working with abstract types, and I'm wondering why this is invalid: class A {} class B extends A {} class X {type T = A} class Y extends X {override type T = B} Seeing as B
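
A minimal sketch of why, and the usual workaround: a concrete type member (type T = A) is a final definition that cannot be overridden, but an upper-bounded abstract member can be refined in a subclass:

    class A {}
    class B extends A {}
    class X { type T <: A }                     // abstract, bounded by A
    class Y extends X { override type T = B }   // compiles, since B <: A
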
mdenton8

0 votes · 0 answers · 5 views

scala spark, how do I merge a set of columns to a single one on a dataframe?

I'm looking for a way to do this without a UDF; I am wondering if it's possible. Let's say I have a DF as follows: Buyer_name Buyer_state CoBuyer_name CoBuyers_state Price Date Bob CA Joe CA 20 010119 Stacy IL Jamie IL...
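
A minimal sketch without a UDF, assuming Spark 2.x and the column names from the question; the built-in concat_ws merges several columns into one:

    import org.apache.spark.sql.functions._

    val merged = df
      .withColumn("buyers", concat_ws(",", col("Buyer_name"), col("CoBuyer_name")))
      .drop("Buyer_name", "CoBuyer_name")
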
NutellaAddict

1 vote · 1 answer · 568 views

Error handling in for comprehension

I have some lists to traverse, but I want to know if there is some trouble during the traversal. Example: val feedChildrens = for { persone ... } yield "Go purchase more cake!" But with this system we cannot get a precise error like "Error: this is a ghost city" or "Error: there are children in this city". I know i...
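
A minimal sketch of the usual approach, assuming Scala 2.12+ where Either is right-biased: make each step return an Either carrying its own message, and the for comprehension stops at the first failure with that precise error (the lookup functions are hypothetical):

    case class City(name: String)

    def findCity(name: String): Either[String, City] =
      if (name == "Paris") Right(City(name)) else Left("Error: this is a ghost city")
    def findChildren(c: City): Either[String, List[String]] =
      Right(List("Anna", "Ben"))

    val feedChildrens: Either[String, String] = for {
      city     <- findCity("Paris")
      children <- findChildren(city)
    } yield "Go purchase more cake!"
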
crak

1 vote · 1 answer · 912 views

best way to test Slick session/transaction code

For creating a datasource I have: object MyDataSource { private lazy val dataSource: javax.sql.DataSource = { val ds = new BasicDataSource val conf = ConfigFactory.load() val url = conf.getString("jdbc-url") val driver = conf.getString("jdbc-driver") val username = conf.getString("db-username") val pa...
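
A minimal sketch of a common testing approach: point the same BasicDataSource at an in-memory H2 database, so Slick session/transaction code runs against a throwaway schema instead of the real one:

    import org.apache.commons.dbcp.BasicDataSource

    def testDataSource(): javax.sql.DataSource = {
      val ds = new BasicDataSource
      ds.setDriverClassName("org.h2.Driver")
      ds.setUrl("jdbc:h2:mem:test;DB_CLOSE_DELAY=-1")   // fresh, in-memory DB
      ds.setUsername("sa")
      ds.setPassword("")
      ds
    }
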
user2066049

1 vote · 1 answer · 4k views

Is it possible to import custom java classes into Scala?

I am trying to learn Scala. From whatever I could look into so far, I see it mentioned that Scala is pretty compatible with Java: "You can call the methods of either language from methods in the other one". But I just couldn't find a suitable example of a custom Java class being imported into a Scala...
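
A minimal sketch, assuming a hypothetical Java class com.example.Greeter with a greet(String) method compiled onto the same classpath; importing it from Scala works exactly like importing a Scala class:

    import com.example.Greeter   // a custom Java class

    object Main extends App {
      val g = new Greeter
      println(g.greet("Scala"))
    }
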
5122014009

1 vote · 1 answer · 1.8k views

Working with DateTime in Play 2.3.7 framework

I am looking for a clean solution to work with datetime in Play framework. Specifically I need to persist datetime in a database, convert that datetime to a Scala object, and convert that datetime to and from Json. For example I am making a bidding service like eBay, so I want to persist the time an...
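
A minimal sketch, assuming Joda-Time (which Play 2.3 ships with) and Play's built-in Joda JSON formats; Auction is a hypothetical model:

    import org.joda.time.DateTime
    import play.api.libs.json._

    case class Auction(id: Long, endsAt: DateTime)

    object Auction {
      implicit val format: Format[Auction] = Json.format[Auction]
    }
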
Khanetor

1 vote · 1 answer · 311 views

is client thread-safe in Twitter Finagle

Is the client in Twitter Finagle thread-safe? I don't want to call newClient in each RPC function (I think it would make a new connection to the destination each time, wouldn't it?), so it seems that re-using the same client object is a good choice. But because the RPC functions may be called by different threads a...
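
A minimal sketch of the recommended pattern: build one client Service at startup and share it across threads; a Finagle Service multiplexes over its own connection pool rather than opening a connection per call:

    import com.twitter.finagle.{Http, Service}
    import com.twitter.finagle.http.{Request, Response}

    object Rpc {
      // created once, reused by every RPC function and thread
      val client: Service[Request, Response] =
        Http.client.newService("api.example.com:80")   // destination is a placeholder
    }
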
hl1020

0 votes · 0 answers · 18 views

How is a typed Scala object losing its type?

In the following piece of code, entities is a Map[String, Seq[String]] object that I receive from some other piece of code. The goal is to map the entities object into a two column Spark DataFrame; but, before I get there, I found some very unusual results. val data: Map[String, Seq[String]] = Map("...
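
A minimal sketch of the intended mapping, assuming a SparkSession held in a val named spark: converting the Map to a Seq of pairs first gives toDF statically known column types:

    import spark.implicits._

    val data: Map[String, Seq[String]] = Map("a" -> Seq("x", "y"))
    val df = data.toSeq.toDF("key", "values")
    df.printSchema()   // key: string, values: array<string>
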
kingledion

1 vote · 2 answers · 686 views

Packaging src/test into a jar in a sbt project

In an sbt project, the standard source locations for testing are: Scala sources in src/test/scala/ and Java sources in src/test/java/. Those test files are not packaged when I run sbt package. I want to copy the packaged jar to a remote machine to run the tests. Is there a way to let sbt include test resour...
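
A minimal sketch: sbt can package the test sources as their own artifact, and publishing that artifact can be switched on in build.sbt (sbt 0.13 syntax):

    // from the shell: builds <name>_<scalaVersion>-<version>-tests.jar
    //   sbt test:package

    // build.sbt, to publish the test jar alongside the main one
    publishArtifact in Test := true
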
Yifei

1 vote · 2 answers · 1.2k views

how to use mocking in unit testing in scala

Hi, I am new to unit testing and I want to test using mock objects. I want to test whether data is successfully stored in MongoDB or not. Here is my code: package models.RegularUserModels import models.UserModels.UserStatus._ // User will give information to Signup class DirectUser() extends Reg...
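
A minimal sketch with plain Mockito, assuming a hypothetical UserStore trait between DirectUser and MongoDB; the test then verifies the interaction without a running database:

    import org.mockito.Mockito._

    trait UserStore { def save(name: String): Boolean }

    val store = mock(classOf[UserStore])
    when(store.save("alice")).thenReturn(true)

    assert(store.save("alice"))     // stubbed call succeeds
    verify(store).save("alice")     // and the interaction is recorded
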
swaheed

1 vote · 1 answer · 2k views

SBT/Scala: macro implementation not found

I tried my hand at macros, and I keep running into the error: macro implementation not found: W [error] (the most common reason for that is that you cannot use macro implementations in the same compilation run that defines them) I believe I've set up a two-pass compilation with the macro implementati...
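
A minimal sketch of the standard layout: macro implementations live in their own sub-project, so they are fully compiled before the code that expands them:

    // build.sbt
    lazy val macros = (project in file("macros"))
      .settings(libraryDependencies +=
        "org.scala-lang" % "scala-reflect" % scalaVersion.value)

    lazy val core = (project in file("core")).dependsOn(macros)
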
Wonko

1 vote · 1 answer · 592 views

Can we define nested projects in an sbt project?

We can define multiple projects in an sbt project, like: lazy val core = project in file("core") lazy val web = project in file("web") lazy val shared = project in file("shared") But is it possible to define nested projects inside a sub-project? Like: lazy val nested = project in file("nested") lazy val...
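
A minimal sketch: sbt's project graph itself is flat, but a sub-project's base directory can sit inside another's, which gives a nested layout while every project is still declared in the root build:

    lazy val web    = project in file("web")
    lazy val nested = project in file("web/nested")   // physically nested under web
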
Freewind

0 votes · 1 answer · 21 views

Unit-testing with cats-effect's IO monad

The scenario: In an application I am currently writing, I am using cats-effect's IO monad in an IOApp. If started with the command line argument 'debug', I am delegating my program flow into a debug loop that waits for user input and executes all kinds of debugging-relevant methods. As soon as the devel...
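
A minimal sketch, assuming cats-effect 2.x where unsafeRunSync needs no implicit runtime: in a test, an IO program can simply be forced and its result asserted on:

    import cats.effect.IO

    val program: IO[Int] = IO.pure(40).map(_ + 2)
    assert(program.unsafeRunSync() == 42)
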
Florian Baierl

1 vote · 0 answers · 19 views

Getting dead letters from an Actor when trying to get back the response from it

I have a ParentActor and 2 ChildActors. Here is my code: class ParentActor extends Actor { val mongoActor = context.of..... val esActor = context.of ............ def receive { case InserInMongo(obj) => val mFuture = ask(mongoActor, InsertDataInMongo(object)).mapTo[Boolean] mFuture.onComplete { case Su...
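
A minimal sketch of the usual cause and fix: inside a Future callback, sender() no longer points at the original asker, so replies go to dead letters; capture it first, or pipe the future back (a fragment of the actor's receive, assuming an implicit akka.util.Timeout in scope for ask):

    import akka.pattern.{ask, pipe}
    import context.dispatcher            // execution context for pipeTo

    def receive: Receive = {
      case InserInMongo(obj) =>
        val replyTo = sender()            // capture before the Future completes
        ask(mongoActor, InsertDataInMongo(obj)).mapTo[Boolean].pipeTo(replyTo)
    }
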
swaheed

1 vote · 1 answer · 371 views

How to access fields of case class from JSP?

When I was a Java programmer, my typical approach was to write a POJO with getters and access its fields in JSP through those getters: ${pojo.field}. Now I'm trying to use Scala. Case classes look like a good replacement for POJOs, but Scala provides field() getters instead of getField(), which is re...
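
A minimal sketch: annotating the fields with @BeanProperty makes the compiler emit the getField()/setField() pairs that JSP's ${} expressions resolve:

    import scala.beans.BeanProperty

    case class User(@BeanProperty name: String, @BeanProperty age: Int)

    // JSP can now evaluate ${user.name} through the generated getName()
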
Jofsey

1 vote · 2 answers · 1.4k views

Spark Streaming MQTT

I've been using spark to stream data from kafka and it's pretty easy. I thought using the MQTT utils would also be easy, but it is not for some reason. I'm trying to execute the following piece of code. val sparkConf = new SparkConf(true).setAppName("amqStream").setMaster("local") val ssc = new Stre...
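
A minimal sketch, assuming the spark-streaming-mqtt artifact is on the classpath; the broker URL and topic are placeholders:

    import org.apache.spark.streaming.mqtt.MQTTUtils

    val lines = MQTTUtils.createStream(ssc, "tcp://localhost:1883", "amqStream")
    lines.print()
    ssc.start()
    ssc.awaitTermination()
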
Thiago Pereira

1 vote · 3 answers · 436 views

json4s parse json partially

I have a JSON model where the contents of a certain attribute depend on another attribute. Something like this: "paymentMethod": "CREDIT_CARD", "metaData": { "cardType": "VISA", "panPrefix": "", "panSuffix": "", "cardHolder": "", "expiryDate": "" } So when paymentMethod equals CREDIT_CARD, the metad...
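
A minimal sketch with json4s: read paymentMethod first, then extract metaData only for the variant that applies (the case class name is hypothetical):

    import org.json4s._
    import org.json4s.jackson.JsonMethods._

    implicit val formats: Formats = DefaultFormats

    case class CardMetaData(cardType: String, panPrefix: String, panSuffix: String,
                            cardHolder: String, expiryDate: String)

    val raw = """{"paymentMethod":"CREDIT_CARD","metaData":{"cardType":"VISA","panPrefix":"","panSuffix":"","cardHolder":"","expiryDate":""}}"""
    val json = parse(raw)
    (json \ "paymentMethod").extract[String] match {
      case "CREDIT_CARD" => Some((json \ "metaData").extract[CardMetaData])
      case _             => None
    }
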
Haspemulator

1 vote · 1 answer · 2k views

How to suppress warnings with sbt compile

I am currently working on a project; it works fine. But I want to suppress the warnings when I run sbt compile: I don't want my terminal to show warnings such as [warn] /Users/kumarshubham/Documents/repositories/alice/app/misc/QueryDB.scala:14: imported `QueryString'...
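
A minimal sketch: scalac's -nowarn flag disables warning output, and adding it to scalacOptions applies it to sbt compile:

    // build.sbt
    scalacOptions += "-nowarn"
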
Hackaholic

1 vote · 2 answers · 1.9k views

Filter columns in large dataset with Spark

I have a dataset which is 1,000,000 rows by about 390,000 columns. The fields are all binary, either 0 or 1. The data is very sparse. I've been using Spark to process this data. My current task is to filter the data--I only want data in 1000 columns that have been preselected. This is the curren...
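
A minimal sketch: with the preselected names in hand, a single select projects the sparse DataFrame down to those 1000 columns:

    import org.apache.spark.sql.functions.col

    val keep: Seq[String] = Seq("col0001", "col0002")   // placeholder for the 1000 names
    val filtered = df.select(keep.map(col): _*)
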
npp1993

1 vote · 3 answers · 140 views

Scala generic function for dividing a collection by modulo

I want to evenly divide a collection into several groups and here is my code: import scala.collection.mutable.ArrayBuffer object MyCollection { def divideByModulo[A](items: Array[A], div: Int): Array[Array[A]] = { require(items.size >= div) val itemSets = (0 until div) map { i => ArrayBuffer(items(i...
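
A minimal sketch of an immutable alternative to the ArrayBuffer version: for each remainder i, pick the items whose index is congruent to i modulo div:

    def divideByModulo[A](items: Vector[A], div: Int): Vector[Vector[A]] = {
      require(items.size >= div)
      Vector.tabulate(div) { i =>
        items.indices.collect { case j if j % div == i => items(j) }.toVector
      }
    }

    divideByModulo(Vector(1, 2, 3, 4, 5), 2)   // Vector(Vector(1, 3, 5), Vector(2, 4))
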
Qian Lin

0 votes · 0 answers · 5 views

Does Scala with Spark need manual GC?

I'm using GraphX within Scala to do some calculations. I need to drop some vertices from a graph and loop several times, but the code always throws java.lang.StackOverflowError: var rawG = GraphLoader.edgeListFile(sc, inputFilePath, edgeStorageLevel = StorageLevel.MEMORY_AND_DISK, vertexStorageLevel = Storage...
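
A minimal sketch of the usual remedy: the StackOverflowError comes from an RDD lineage that grows on every iteration, not from missing manual GC; checkpointing inside the loop truncates it (keep is a hypothetical predicate):

    sc.setCheckpointDir("/tmp/checkpoints")
    var g = rawG
    for (i <- 1 to 100) {
      g = g.subgraph(vpred = (id, attr) => keep(id, attr))
      if (i % 10 == 0) g.checkpoint()   // cut the lineage periodically
    }
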
macroxmu

1 vote · 1 answer · 5.2k views

How to convert Array[org.apache.spark.sql.Row] to Array[Int]

I'm trying to convert Array[Row] to Array[Int], but in vain; I'd appreciate some help. scala> res17(0).toInt :30: error: value toInt is not a member of org.apache.spark.sql.Row res17(0).toInt ^ scala> res17(0) res28: org.apache.spark.sql.Row = [1,0] scala> res17(0).toArray :30: error: value toArray is not...
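
A minimal sketch, assuming res17 is the Array[Row] from the question and the wanted value is the first column: Row has positional typed accessors, so map over the array:

    val ints: Array[Int] = res17.map(_.getInt(0))

    // if the column is actually a string, convert instead:
    // res17.map(_.getString(0).toInt)
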
PKM15

1 vote · 0 answers · 11 views

How to use nested generic types in Scala as method return type

I have a few methods for which I would like to use a nested generic type as the return type. import scala.language.higherKinds trait DBClient { def execute[T[U]
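
A minimal sketch of one way to declare this: make the container a higher-kinded type parameter T[_] and the element type a separate parameter:

    import scala.language.higherKinds

    trait DBClient {
      def execute[T[_], U](query: String): T[U]
    }

    // call sites then fix both parameters, e.g. client.execute[List, String]("...")
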
Aqueel Miqdad

2 votes · 2 answers · 97 views

Circular project dependencies with a testkit sbt

I maintain an open source bitcoin library called bitcoin-s. If you look at the build.sbt file you will see that the testkit project depends on the rpc project, and the rpc project depends on the testkit project as a published dependency inside of our Deps.scala file. This is unfortunate because if we...
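
A minimal sketch of the common untangling: move the code both sides need into a third sub-project, so the dependency arrows all point one way:

    lazy val core    = project in file("core")
    lazy val rpc     = (project in file("rpc")).dependsOn(core)
    lazy val testkit = (project in file("testkit")).dependsOn(core, rpc)
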
Chris Stewart

1 vote · 2 answers · 335 views

Accessing Play configuration from Akka Actor

I have an Akka Actor in my Play app that accesses Play's configuration using a now deprecated method. class MyActor (supervisor: ActorRef) extends Actor { val loc = Play.current.configuration.getString("my.location").get def receive = { case _ => } } if I do this: import javax.inject._ class MyActor...
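
A minimal sketch of the injected replacement (Play 2.6+ syntax for the typed getter):

    import javax.inject._
    import akka.actor._
    import play.api.Configuration

    class MyActor @Inject() (config: Configuration) extends Actor {
      val loc: String = config.get[String]("my.location")
      def receive: Receive = { case _ => }
    }
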
awfulHack

1 vote · 2 answers · 2.1k views

Type mismatch: found java.util.List[String], required List[String]

I am very new to Scala. I have started using Scala for my Spark project and am using some Java code. I am getting an error on the following line: case class docDisplay( id: String, name: String, session: String, time: String, docguid: scala.collection.immutable.List[String] )...
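
A minimal sketch of the conversion at the Java/Scala boundary, which is what the compiler is asking for:

    import scala.collection.JavaConverters._

    val javaGuids: java.util.List[String] = new java.util.ArrayList[String]()
    val scalaGuids: scala.collection.immutable.List[String] = javaGuids.asScala.toList
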
user2844511

1 vote · 1 answer · 696 views

Spark not saving the dataframe as a parquet file

Trying to save the Spark dataframe as a parquet file, but unable to, due to the exception below. Kindly guide me if I am missing something. The dataframe has been constructed from the Kafka stream RDDs. dataframe.write.parquet("/user/space") Exception stack: Exception in thread "streaming-jo...
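
A minimal sketch of the corrected call: the writer method is spelled parquet, and an explicit save mode avoids failures when the path already exists across streaming batches:

    dataframe.write.mode("append").parquet("/user/space")
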
Bindumalini KK

1 vote · 1 answer · 527 views

Spark Streaming Sliding Window max and min

I am a beginner with Spark; I am working on a Spark Streaming use case where I receive JSON messages. Each JSON message has an attribute 'value', which is a double; after parsing the JSON I get an Array[Double]. I want to find max(value) and min(value) for the last 15 seconds with a sliding window of 2 seconds. Here is my...
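
A minimal sketch, assuming the parsed values arrive as a DStream[Double] named values: window over 15 seconds sliding every 2, then reduce per window:

    import org.apache.spark.streaming.Seconds

    val windowed = values.window(Seconds(15), Seconds(2))
    windowed.reduce(math.max(_, _)).print()   // max per window
    windowed.reduce(math.min(_, _)).print()   // min per window
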
nilesh1212

0 votes · 0 answers · 2 views

Spark 2.3.1 Scala 2.12.6 : Unable to import sqlContext.implicits._ with Maven

I receive the following error when trying to compile import sqlContext.implicits._ with Maven. error: not found: object spark [ERROR] import spark.implicits._ [ERROR] ^ [ERROR] one error found I found previous posts on this e.g. Spark Scala : Unable to import sqlContext.implicits._ However,...
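
A minimal sketch of the usual fix: implicits._ must be imported from a stable identifier, i.e. a val holding the live SparkSession, so the import needs that val in scope rather than a bare name:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("app").master("local[*]").getOrCreate()
    import spark.implicits._   // works: spark is a stable identifier
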
Christian
