Scala Language

#scala

Table of Contents

About
Chapter 1: Getting started with Scala Language
Chapter 2: Annotations
Chapter 3: Best Practices
Chapter 4: Case Classes
Chapter 5: Classes and Objects
Chapter 6: Collections
Chapter 7: Continuations Library
Chapter 8: Currying
Chapter 9: Dependency Injection
Chapter 10: Dynamic Invocation
Chapter 11: Enumerations
Chapter 12: Error Handling
Chapter 13: Extractors
Chapter 14: For Expressions
Chapter 15: Functions
Chapter 16: Futures
Credits
About
You can share this PDF with anyone you feel could benefit from it. Download the latest version
from: scala-language
It is an unofficial and free Scala Language ebook created for educational purposes. All the content
is extracted from Stack Overflow Documentation, which is written by many hardworking individuals
at Stack Overflow. It is neither affiliated with Stack Overflow nor official Scala Language.
The content is released under Creative Commons BY-SA, and the list of contributors to each
chapter is provided in the credits section at the end of this book. Images may be copyright of
their respective owners unless otherwise specified. All trademarks and registered trademarks are
the property of their respective company owners.
Use the content presented in this book at your own risk; it is not guaranteed to be correct or
accurate. Please send your feedback and corrections to info@zzzprojects.com
Chapter 1: Getting started with Scala
Language
Remarks
Scala is a modern multi-paradigm programming language designed to express common
programming patterns in a concise, elegant, and type-safe way. It smoothly integrates features of
object-oriented and functional languages.
Most given examples require a working Scala installation. This is the Scala installation page, and
this is the 'How to setup Scala' example. scalafiddle.net is a good resource for executing small
code examples over the web.
Versions

Version    Release Date
2.10.1     2013-03-13
2.10.2     2013-06-06
2.10.3     2013-10-01
2.10.4     2014-03-24
2.10.5     2015-03-05
2.10.6     2015-09-18
2.11.0     2014-04-21
2.11.1     2014-05-21
2.11.2     2014-07-24
2.11.4     2014-10-30
2.11.5     2015-01-14
2.11.6     2015-03-05
2.11.7     2015-06-23
2.11.8     2016-03-08
2.11.11    2017-04-19
2.12.0     2016-11-03
2.12.1     2016-12-06
2.12.2     2017-04-19
Examples
Hello World by Defining a 'main' Method
object Hello {
def main(args: Array[String]): Unit = {
println("Hello World!")
}
}
To compile it:

$ scalac HelloWorld.scala
To run it:
$ scala Hello
When the Scala runtime loads the program, it looks for an object named Hello with a main method.
The main method is the program entry point and is executed.
Note that, unlike Java, Scala has no requirement of naming objects or classes after the file they're
in. Instead, the parameter Hello passed in the command scala Hello refers to the object to look for
that contains the main method to be executed. It is perfectly possible to have multiple objects with
main methods in the same .scala file.
The args array will contain the command-line arguments given to the program, if any. For instance,
we can modify the program like this:
object HelloWorld {
def main(args: Array[String]): Unit = {
println("Hello World!")
for {
arg <- args
} println(s"Arg=$arg")
}
}
Compile it:
$ scalac HelloWorld.scala
$ scala HelloWorld 1 2 3
Hello World!
Arg=1
Arg=2
Arg=3
By extending the App trait, you can avoid defining an explicit main method. The entire body of the
HelloWorld object is treated as "the main method".
2.11.0

Delayed Initialization

Per the official documentation, App makes use of a feature called Delayed Initialization.
This means that the object fields are initialized after the main method is called.
DelayedInit is now deprecated for general use, but is still supported for App as a
special case. Support will continue until a replacement feature is decided upon and
implemented.
When using App, the body of the object will be executed as the main method; there is no need to
override main.
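For example, a minimal sketch of the App-based version, equivalent to the explicit main method above (the original snippet was not preserved):

object HelloWorld extends App {
  println("Hello World!")                    // runs when the object is initialized
  args.foreach(arg => println(s"Arg=$arg"))  // `args` is provided by the App trait
}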
Scala can be used as a scripting language. To demonstrate, create HelloWorld.scala with the
following content:
println("Hello")
Execute it with the command-line interpreter (the $ is the command line prompt):
$ scala HelloWorld.scala
Hello
If you omit .scala (such as if you simply typed scala HelloWorld) the runner will look for a compiled
.class file with bytecode instead of compiling and then executing the script.
In operating systems utilizing bash or similar shell terminals, Scala scripts can be executed using a
'shell preamble'. Create a file named HelloWorld.sh and place the following as its content:
#!/bin/sh
exec scala "$0" "$@"
!#
println("Hello")
The part between #! and !# is the 'shell preamble', and is interpreted as a bash script. The rest is
Scala.
Once you have saved the above file, you must grant it 'executable' permissions. In the shell you
can do this:

$ chmod a+x HelloWorld.sh

(Note that this gives permission to everyone: read about chmod to learn how to set it for more
specific sets of users.)
$ ./HelloWorld.sh
When you execute scala in a terminal without additional parameters it opens up a REPL (Read-
Eval-Print Loop) interpreter:
nford:~ $ scala
Welcome to Scala 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_66).
Type in expressions for evaluation. Or try :help.
scala>
The REPL allows you to execute Scala in a worksheet fashion: the execution context is preserved
and you can manually try out commands without having to build a whole program. For instance,
after defining val poem = "As halcyons we shall be", printing it would look like this:
scala> print(poem)
As halcyons we shall be
But in the REPL you can redefine a val (which would cause an error in a normal Scala program, if
it was done in the same scope):
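For example, a sketch of such a REPL session (the exact output formatting varies by version):

scala> val poem = "As halcyons we shall be"
poem: String = As halcyons we shall be

scala> val poem = "Redefined in the REPL"
poem: String = Redefined in the REPL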
For the remainder of your REPL session this newly defined variable will shadow the previously
defined variable. REPLs are useful for quickly seeing how objects or other code works. All of
Scala's features are available: you can define functions, classes, methods, etc.
Scala Quicksheet
Bind a function to a name with explicit type:

val f: Int => Int = (x: Int) => x * x

Nested looping:

for {
  x <- Seq(1,2,3)
  y <- Seq(4,5,6)
} print(x + ":" + y)
Chapter 2: Annotations
Syntax
• @AnAnnotation def someMethod = {...}
• @AnAnnotation class someClass {...}
• @AnnotationWithArgs(annotation_args) def someMethod = {...}
Parameters
Parameter Details
Remarks
Scala-lang provides a list of standard annotations and their Java equivalents.
Examples
Using an Annotation
@deprecated
def anUnusedLegacyMethod(someArg: Any) = {
...
}
/**
 * @param num Numerator
 * @param denom Denominator
 * @throws ArithmeticException in case `denom` is `0`
 */
class Division @throws[ArithmeticException](/*no annotation parameters*/) protected (num: Int, denom: Int) {
  private[this] val wrongValue = num / denom
}
The visibility modifier (in this case protected) should come after the annotations in the same line. In
case the annotation accepts optional parameters (as in this case @throws accepts an optional
cause), you have to specify an empty parameter list for the annotation: () before the constructor
parameters.
Note: Multiple annotations can be specified, even from the same type (repeating annotations).
Similarly with a case class without auxiliary factory method (and cause specified for the
annotation):
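A minimal sketch of what that can look like (the concrete snippet was not preserved; the cause text is illustrative):

case class Division @throws[ArithmeticException](cause = "denom is 0") (num: Int, denom: Int) {
  private[this] val wrongValue = num / denom
}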
You can create your own Scala annotations by creating classes derived from
scala.annotation.StaticAnnotation or scala.annotation.ClassfileAnnotation:
package animals
// Create Annotation `Mammal`
class Mammal(indigenous:String) extends scala.annotation.StaticAnnotation
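// The annotated class inspected in the REPL session below is assumed (a minimal sketch):
@Mammal(indigenous = "North America")
class Platypus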
scala>import scala.reflect.runtime.{universe ⇒ u}
scala>platypusSymbol.annotations
List[reflect.runtime.universe.Annotation] = List(animals.reflection.Mammal("North America"))
Chapter 3: Best Practices
Remarks
Prefer vals, immutable objects, and methods without side effects. Reach for them first.
Use vars, mutable objects, and methods with side effects when you have a specific
need and justification for them.
Examples
Keep it simple
Do not overcomplicate simple tasks. Most of the time you will need only:
• algebraic datatypes
• structural recursion
• monad-like api (map, flatMap, fold)
These things are not clear for newcomers: avoid using them before you understand them. Using
advanced concepts without a real need obfuscates the code, making it less maintainable.
if (userAuthorized.nonEmpty) {
  makeRequest().map {
    case Success(response) =>
      someProcessing(...)
      if (resendToUser) {
        sendToUser(...)
      }
      ...
  }
}
If all your functions return Either or another Validation-like type, you can write:
for {
user <- authorizeUser
response <- requestToThirdParty(user)
_ <- someProcessing(...)
} {
sendToUser
}
By default:
• Use val, not var, wherever possible. This allows you to take seamless advantage of a
number of functional utilities, including work distribution.
• Use recursion and comprehensions, not loops.
• Use immutable collections. This is a corollary to using val whenever possible.
• Focus on data transformations, CQRS-style logic, and not CRUD.
• var can be used for local state (for example, inside an actor).
• mutable gives better performance in certain situations.
Chapter 4: Case Classes
Syntax
• case class Foo() // Case classes with no parameters must have an empty list
• case class Foo(a1: A1, ..., aN: AN) // Create a case class with fields a1 ... aN
• case object Bar // Create a singleton case class
Examples
Case Class Equality
One feature provided for free by case classes is an auto-generated equals method that checks the
value equality of all individual member fields instead of just checking the reference equality of the
objects.
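For example (a minimal sketch):

case class Foo(i: Int)

val a = Foo(1)
val b = Foo(1)
a == b   // true: structural (value) equality, even though a and b are different instances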
The case modifier causes the Scala compiler to automatically generate common boilerplate code
for the class. Implementing this code manually is tedious and a source of errors. The following
case class definition:
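A minimal sketch of such a definition, consistent with the generated-code excerpt that follows:

case class Person(name: String, age: Int)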
def productElement(i: Int): Any = i match {
case 0 => name
case 1 => age
case _ => throw new IndexOutOfBoundsException(i.toString)
}
When applied to an object, the case modifier has similar (albeit less dramatic) effects. Here the
primary gains are a toString implementation and a hashCode value that is consistent across
processes. Note that case objects (correctly) use reference equality:
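For example (a minimal sketch):

case object Bar

Bar == Bar     // true: there is only a single instance, so reference equality suffices
Bar.toString   // "Bar": generated toString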
It is still possible to manually implement methods that would otherwise be provided by the case
modifier, both in the class itself and in its companion object.
• All constructor arguments are public and can be accessed on initialized objects (normally
this is not the case, as demonstrated here):
• It provides an implementation for the following methods: toString, equals, hashCode (based on
properties), copy, apply and unapply:
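A minimal sketch illustrating both points:

class Plain(name: String)      // NOT a case class: `name` is not a member
// new Plain("Ada").name       // does not compile

case class Person(name: String, age: Int)

val p = Person("Ada", 36)
p.name                     // "Ada": constructor arguments are public members
p.toString                 // "Person(Ada,36)"
p == Person("Ada", 36)     // true
p.copy(age = 37)           // Person(Ada,37)
val Person(n, a) = p       // unapply used for extraction: n = "Ada", a = 36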
sealed trait Animal // `sealed` modifier allows inheritance within current build-unit
only
case class Dog(age: Int) extends Animal
case class Cat(owner: String) extends Animal
val x: Animal = Dog(18)
x match {
case Dog(x) => println(s"It's a $x years old dog.")
case Cat(x) => println(s"This cat belongs to $x.")
}
The Scala compiler prefixes every argument in the parameter list by default with val. This means
that, by default, case classes are immutable. Each parameter is given an accessor method, but
there are no mutator methods. For example:
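case class Foo(i: Int)

val fooInstance = Foo(1)
fooInstance.i         // accessor: 1
// fooInstance.i = 2  // does not compile: i is a val, so no mutator is generated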
Declaring a parameter in a case class as var overrides the default behavior and makes the case
class mutable:
case class Bar(var i: Int)
Another instance when a case class is 'mutable' is when the value in the case class is mutable:
import scala.collection._
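A minimal sketch of such a case class (names are illustrative):

case class Bar(m: mutable.Map[Int, Int])

val bar = Bar(mutable.Map(1 -> 2))
bar.m(1) = 7   // mutates the map that m points to, not m itself
bar            // Bar(Map(1 -> 7))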
Note that the 'mutation' that is occurring here is in the map that m points to, not to m itself. Thus, if
some other object had m as a member, it would see the change as well. Note how in the following
example changing instanceA also changes instanceB:
import scala.collection.mutable
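A sketch of the situation described (names are illustrative):

case class Box(map: mutable.Map[Int, Int])

val sharedMap = mutable.Map(1 -> 2)
val instanceA = Box(sharedMap)
val instanceB = Box(sharedMap)

instanceA.map(1) = 42   // 'changing' instanceA...
instanceB.map(1)        // ...is visible via instanceB: 42, because both share the same map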
Case classes provide a copy method that creates a new object that shares the same fields as the
old one, with certain changes.
We can use this feature to create a new object from a previous one that has some of the same
characteristics. This simple case class demonstrates this feature:
case class Person(firstName: String, lastName: String, grade: String, subject: String)
val putu = Person("Putu", "Kevin", "A1", "Math")
val mark = putu.copy(firstName = "Ketut", lastName = "Mark")
// mark: Person = Person(Ketut,Mark,A1,Math)
In this example we can see that the two objects share similar characteristics (grade = A1, subject =
Math), except where they have been specified in the copy (firstName and lastName).
In order to achieve type safety sometimes we want to avoid the use of primitive types on our
domain. For instance, imagine a Person with a name. Typically, we would encode the name as a
String. However, it would not be hard to mix a String representing a Person's name with a String
representing an error message:
To avoid such pitfalls you can encode the data like this:
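A minimal sketch of such an encoding (type names are illustrative):

case class PersonName(value: String)
case class ErrorMessage(value: String)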
and now our code will not compile if we mix PersonName with ErrorMessage, or even an ordinary
String.
But this incurs a small runtime overhead as we now have to box/unbox Strings to/from their
PersonName containers. In order to avoid this, one can make PersonName and ErrorMessage value
classes:
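A sketch of the same encoding as value classes, which avoids the extra allocation:

case class PersonName(value: String) extends AnyVal
case class ErrorMessage(value: String) extends AnyVal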
Chapter 5: Classes and Objects
Syntax
• class MyClass{} // curly braces are optional here as class body is empty
• class MyClassWithMethod {def method: MyClass = ???}
• new MyClass() //Instantiate
• object MyObject // Singleton object
• class MyClassWithGenericParameters[V1, V2](vl: V1, i: Int, v2: V2)
• class MyClassWithImplicitFieldCreation[V1](val v1: V1, val i: Int)
• new MyClassWithGenericParameters(2.3, 4, 5) or with a different type: new
MyClassWithGenericParameters[Double, Any](2.3, 4, 5)
• class MyClassWithProtectedConstructor protected[my.pack.age](s: String)
Examples
Instantiate Class Instances
A class in Scala is a 'blueprint' of a class instance. An instance contains the state and behavior as
defined by that class. To declare a class:
class MyClass{} // curly braces are optional here as class body is empty
or:

class MyClass
Parentheses are optional in Scala for creating objects from a class that has a no-argument
constructor. If a class constructor takes arguments:
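A minimal sketch of such a class (the concrete snippet is not shown above):

class MyClass(arg: Int)        // class with a one-argument constructor
val instance = new MyClass(2)  // parentheses are required because an argument is passed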
Here MyClass requires one Int argument, which can only be used internally to the class. arg cannot
be accessed outside MyClass unless it is declared as a field:
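For example, attempting to read it fails to compile:

// instance.arg   // does not compile: arg is not a member of MyClass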
Alternatively it can be declared public in the constructor:
class MyClass(val arg : Int) // Class definition with arg declared public
var instance = new MyClass(2) // Instance instantiation
instance.arg //arg is now visible to clients
class MyClass
But if you are not paying attention, in some cases the optional parentheses can produce unexpected
behavior. Suppose we want to create a task that should run in a separate thread. Below is the
sample code:
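A sketch of the problematic code being described (reconstructed; note the curly braces right after new Thread):

new Thread {
  new Runnable {
    override def run(): Unit = {
      // perform task
      println("Performing task.")
    }
  }
}.start()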
We may think that this sample code, if executed, will print Performing task., but to our surprise, it
won't print anything. Let's see what's happening here. If you look closely, we have used curly
braces {} right after new Thread. This creates an anonymous class which extends Thread.
Then, in the body of this anonymous class, we defined our task (again creating an
anonymous class, this time implementing the Runnable interface). So we might have thought we were
using the public Thread(Runnable target) constructor, but in fact (by ignoring the optional ()) we
used the public Thread() constructor, with nothing defined in the body of the run() method. To
rectify the problem, we need to use parentheses instead of curly braces:
new Thread(
  new Runnable {
    override def run(): Unit = {
      // perform task
      println("Performing task.")
    }
  }
).start()
Singleton Objects
Scala supports static members, but not in the same manner as Java. Scala provides an alternative
to this called Singleton Objects. Singleton objects are similar to a normal class, except they can
not be instantiated using the new keyword. Below is a sample singleton class:
object Factorial {
private val cache = Map[Int, Int]()
def getCache = cache
}
Note that we have used the object keyword to define a singleton object (instead of 'class' or 'trait').
Since singleton objects cannot be instantiated, they cannot have constructor parameters. Accessing a
singleton object looks like this:
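For example:

Factorial.getCache   // members are accessed directly on the object; no instantiation is needed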
Note that this looks exactly like accessing a static method in a Java class.
Companion Objects
In Scala singleton objects may share the name of a corresponding class. In such a scenario the
singleton object is referred to as a Companion Object. For instance, below the class Factorial is
defined, and a companion object (also named Factorial) is defined below it. By convention
companion objects are defined in the same file as their companion class.
class Factorial(num: Int) {
  def fact(num: Int): Int = if (num <= 1) 1 else (num * fact(num - 1))

  def calculate(): Int = {   // sketch: the surrounding class was not fully preserved
    if (!Factorial.cache.contains(num)) Factorial.cache += (num -> fact(num))
    Factorial.cache(num)
  }
}
object Factorial {
private val cache = scala.collection.mutable.Map[Int, Int]()
}
In this example we are using a private cache to store factorial of a number to save calculation time
for repeated numbers.
Here object Factorial is a companion object and class Factorial is its corresponding companion
class. Companion objects and classes can access each other's private members. In the example
above, the Factorial class is accessing the private cache member of its companion object.
Note that a new instantiation of the class will still utilize the same companion object, so any
modification to member variables of that object will carry over.
Objects
Whereas Classes are more like blueprints, Objects are static (i.e. already instantiated):
object Dog {
def bark: String = "Raf"
}
They are often used as a companion to a class; they allow you to write:
object Dog {
def apply(name: String): Dog = new Dog(name)
}
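For example (assuming a corresponding class Dog exists):

class Dog(name: String)

val dog = Dog("Spot")   // sugar for Dog.apply("Spot"); no `new` is required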
variable match {
case _: Type => true
case _ => false
}
Both isInstanceOf and pattern matching check only the object's type, not its generic
parameter (no type reification), except for arrays:
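For example (a minimal sketch):

List("a").isInstanceOf[List[Int]]    // true: only `List` is checked (with an 'unchecked' warning)
Array("a").isInstanceOf[Array[Int]]  // false: arrays are reified on the JVM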
But
variable match {
case _: Type => true
}
Examples:
x match {
  case _: java.lang.Integer => true // better: do something
  case _ => false
} //> res1: Boolean = true
Remark: This is only about the behaviour on the JVM, on other platforms (JS, native) type
casting/checking might behave differently.
Constructors
Primary Constructor
In Scala the primary constructor is the body of the class. The class name is followed by a
parameter list, which are the constructor arguments. (As with any function, an empty parameter list
may be omitted.)
class Bar {
...
}
The construction parameters of an instance are not accessible outside its constructor body unless
marked as an instance member by the val keyword:
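For example (a minimal sketch):

class Foo(x: Int, val y: Int)

val foo = new Foo(1, 2)
// foo.x   // does not compile: x is not a member
foo.y      // 2: y was promoted to a (read-only) field by val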
Any operations that should be performed when an instance of an object is instantiated are written
directly in the body of the class:
class DatabaseConnection
(host: String, port: Int, username: String, password: String) {
/* first connect to the DB, or throw an exception */
private val driver = new AwesomeDB.Driver()
driver.connect(host, port, username, password)
def isConnected: Boolean = driver.isConnected
...
}
Note that it is considered good practice to put as few side effects into the constructor as possible;
instead of the above code, one should consider having connect and disconnect methods so that
consumer code is responsible for scheduling IO.
Auxiliary Constructors
A class may have additional constructors called 'auxiliary constructors'. These are defined by
constructor definitions in the form def this(...) = e, where e must invoke another constructor:
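A sketch of the class the usage below refers to (the original definition was not shown):

class Person(val fullName: String) {
  // auxiliary constructor: its body must invoke another constructor first
  def this(firstName: String, lastName: String) = this(s"$firstName $lastName")
}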
// usage:
new Person("Grace Hopper").fullName // returns Grace Hopper
new Person("Grace", "Hopper").fullName // returns Grace Hopper
This implies each constructor can have a different modifier: only some may be available publicly:
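For example (a minimal sketch):

class Person private (val fullName: String) {
  // only the auxiliary constructor is public
  def this(firstName: String, lastName: String) = this(s"$firstName $lastName")
}

new Person("Grace", "Hopper")   // compiles
// new Person("Grace Hopper")   // does not compile: the primary constructor is private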
In this way you can control how consumer code may instantiate the class.
Chapter 6: Collections
Examples
Sort A List
The default behavior of sorted() is to use math.Ordering, which for strings results in a lexicographic
sort:
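Assuming, for example, a list such as:

val names = List("Kathryn", "Allie", "Beth", "Serin", "Alana")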
names.sorted
// results in: List(Alana, Allie, Beth, Kathryn, Serin)
sortWith allows you to provide your own ordering utilizing a comparison function:
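For example, sorting by string length (using the names list assumed above):

names.sortWith(_.length < _.length)
// results in: List(Beth, Allie, Serin, Alana, Kathryn)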
names.sorted.reverse
//results in: List(Serin, Kathryn, Beth, Allie, Alana)
Lists can also be sorted using Java method java.util.Arrays.sort and its Scala wrapper
scala.util.Sorting.quickSort
java.util.Arrays.sort(data)
scala.util.Sorting.quickSort(data)
These methods can improve performance when sorting larger collections if the collection
conversions and unboxing/boxing can be avoided. For a more detailed discussion on the
performance differences, read about Scala Collection sorted, sortWith and sortBy Performance.
Create a List containing n copies of x
To create a collection of n copies of some object x, use the fill method. This example creates a
List, but this can work with other collections for which fill makes sense:
// List.fill(n)(x)
scala > List.fill(3)("Hello World")
res0: List[String] = List(Hello World, Hello World, Hello World)
List creation
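For example:

List[Int]()          // empty list
List(1, 2, 3)        // literal syntax
1 :: 2 :: 3 :: Nil   // cons construction, equivalent to the above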
Take element
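For example:

List(1, 2, 3).head   // 1
List(1, 2, 3)(1)     // 2 (zero-based index)
List(1, 2, 3).last   // 3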
Prepend Elements
0 :: List(1, 2, 3) // List(0, 1, 2, 3)
Append Elements
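For example:

List(1, 2, 3) :+ 4   // List(1, 2, 3, 4); note that appending to a List is O(n)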
Common operations
List(1, 2, 3).find(_ == 3) // Some(3)
List(1, 2, 3).map(_ * 2) // List(2, 4, 6)
List(1, 2, 3).filter(_ % 2 == 1) // List(1, 3)
List(1, 2, 3).fold(0)((acc, i) => acc + i * i) // 1 * 1 + 2 * 2 + 3 * 3 = 14
List(1, 2, 3).foldLeft("Foo")(_ + _.toString) // "Foo123"
List(1, 2, 3).foldRight("Foo")(_ + _.toString) // "123Foo"
Note that this deals with the creation of a collection of type Map, which is distinct from
the map method.
Map Creation
Map[String, Int]()
val m1: Map[String, Int] = Map()
val m2: String Map Int = Map()
A map can be considered a collection of tuples for most operations, where the first element is the
key and the second is the value.
Get element
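The snippets below assume, for example:

val m = Map("a" -> 1, "b" -> 2, "c" -> 3)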
m.get("a") // Some(1)
m.get("d") // None
m("a") // 1
m("d") // java.util.NoSuchElementException: key not found: d
m.keys // Set(a, b, c)
m.values // MapLike(1, 2, 3)
Add element(s)
Map("a" -> 1, "b" -> 2) + ("c" -> 3) // Map(a -> 1, b -> 2, c -> 3)
Map("a" -> 1, "b" -> 2) + ("a" -> 3) // Map(a -> 3, b -> 2)
Map("a" -> 1, "b" -> 2) ++ Map("b" -> 3, "c" -> 4) // Map(a -> 1, b -> 3, c -> 4)
Common operations
In operations where an iteration over a map occurs (map, find, foreach, etc.), the elements of the
collection are tuples. The function parameter can either use the tuple accessors (_1, _2), or a
partial function with a case block:
m.filter(_._2 == 2) // Map(b -> 2)
m.foldLeft(0){
case (acc, (key, value: Int)) => acc + value
} // 6
Map
'Mapping' across a collection uses the map function to transform each element of that collection in a
similar way. The general syntax is:
// Initialize
val list = List(1,2,3)
// list: List[Int] = List(1, 2, 3)
// Apply map
list.map((item: Int) => item*2)
// res0: List[Int] = List(2, 4, 6)
Filter
filter is used when you want to exclude or 'filter out' certain elements of a collection. As with map,
the general syntax takes a function, but that function must return a Boolean:
val list = 1 to 10 toList
// list: List[Int] = List(1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
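For example, keeping only the even numbers:

list.filter(_ % 2 == 0)
// res0: List[Int] = List(2, 4, 6, 8, 10)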
The Scala Collections framework, according to its authors, is designed to be easy to use, concise,
safe, fast, and universal.
The framework is made up of Scala traits that are designed to be building blocks for creating
collections. For more information on these building blocks, read the official Scala collections
overview.
These built-in collections are separated into the immutable and mutable packages. By default, the
immutable versions are used. Constructing a List() (without importing anything) will construct an
immutable list.
One of the most powerful features of the framework is the consistent and easy-to-use interface
across like-minded collections. For example, summing all elements in a collection is the same for
Lists, Sets, Vectors, Seqs and Arrays:
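For example (a sketch with assumed values):

val numList = List(1, 2, 3, 4, 5)
val numSet = Set(1, 2, 3, 4, 5)
val numArray = Array(1, 2, 3, 4, 5)

numList.reduce((n1, n2) => n1 + n2)   // 15
numArray.reduce((n1, n2) => n1 + n2)  // 15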
numSet.reduce((n1, n2) => n1 + n2) // 15
Traversable types
Collection classes that have the Traversable trait implement foreach and inherit many methods for
performing common operations to collections, which all function identically. The most common
operations are listed here:
• Map - map, flatMap, and collect produce new collections by applying a function to each
element in the original collection.
// split list of letters into individual strings and put them into the same list
List("a b c", "d e").flatMap(letters => letters.split(" ")) // = List("a", "b", "c", "d", "e")
• Conversions - toList, toArray, and many other conversion operations change the current
collection into a more specific kind of collection. These are usually methods prepended with
'to' and the more specific type (i.e. 'toList' converts to a List).
val array: Array[Int] = List[Int](1, 2, 3).toArray // convert list of ints to array of ints
• Size info - isEmpty, nonEmpty, size, and hasDefiniteSize are all metadata about the set. This
allows conditional operations on the collection, or for code to determine the size of the
collection, including whether it's infinite or discrete.
List().isEmpty // true
List(1).nonEmpty // true
• Element retrieval - head, last, find, and their Option variants are used to retrieve the first or
last element, or find a specific element in the collection.
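List(1, 2, 3).head         // 1
List(1, 2, 3).headOption   // Some(1): safe even on an empty list
List(1, 2, 3).find(_ > 1)  // Some(2)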
• Sub-collection retrieval operations - filter, tail, slice, drop, and other operations allow for
choosing parts of the collection to operate on further.
List(-2, -1, 0, 1, 2).filter(num => num > 0) // = List(1, 2)
• Subdivision operations - partition, splitAt, span, and groupBy split the current collection into
different parts.
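List(1, 2, 3, 4).partition(_ % 2 == 0) // (List(2, 4), List(1, 3))
List(1, 2, 3, 4).splitAt(2)            // (List(1, 2), List(3, 4))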
• Element tests - exists, forall, and count are operations used to check this collection to see if
it satisfies a predicate.
List(1, 2, 3, 4).forall(num => num > 0) // = true, all numbers are positive
List(-3, -2, -1, 1).forall(num => num < 0) // = false, not all numbers are negative
• Folds - foldLeft (/:), foldRight (:\), reduceLeft, and reduceRight are used to apply binary
functions to successive elements in the collection. Go here for fold examples and go here for
reduce examples.
Fold
The fold method iterates over a collection, using an initial accumulator value and applying a
function that uses each element to update the accumulator successively:
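A sketch of the example being discussed (the snippet itself was not preserved; nums, initialValue and sum are the names referred to below):

val nums = List(1, 2, 3, 4, 5)
var initialValue: Int = 0
var sum = nums.fold(initialValue) {
  (accumulator, currentElementBeingIterated) => accumulator + currentElementBeingIterated
}
println(sum) // prints 15 because 0+1+2+3+4+5 = 15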
In the above example, an anonymous function was supplied to fold(). You can also use a named
function that takes two arguments. Bearing this in mind, the above example can be re-written thus:
initialValue = 2;
sum = nums.fold(initialValue){
(accumulator,currentElementBeingIterated) => accumulator + currentElementBeingIterated
}
println(sum) //prints 17 because 2+1+2+3+4+5 = 17
foldLeft() iterates from left to right (from the first element of the collection to the last in that order).
foldRight() iterates from right to left (from the last element to the first element). fold() iterates from left to right
like foldLeft(). In fact, fold() actually calls foldLeft() internally.
def fold[A1 >: A](z: A1)(op: (A1, A1) => A1): A1 = foldLeft(z)(op)
fold(), foldLeft() and foldRight() will return a value that has the same type as the initial value it
takes. However, unlike foldLeft() and foldRight(), the initial value given to fold() can only be of
the same type or a supertype of the type of the collection.
In this example the order is not relevant, so you can change fold() to foldLeft() or foldRight()
and the result will remain the same. Using a function that is sensitive to order will alter results.
Foreach
foreach is unusual among the collections iterators in that it does not return a result. Instead it
applies a function to each element that has only side effects. For example:
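List(1, 2, 3, 4).foreach(x => println(x * x))
// prints 1, 4, 9, 16 (one per line); the overall result is Unit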
The function supplied to foreach can have any return type, but the result will be discarded.
Typically foreach is used when side effects are desirable. If you want to transform data consider
using map, filter, a for comprehension, or another option.
Reduce
The reduce(), reduceLeft() and reduceRight methods are similar to folds. The function passed to
reduce takes two values and yields a third. When operating on a list, the first two values are the
first two values in the list. The result of the function and the next value in the list are then re-
applied to the function, yielding a new result. This new result is applied with the next value of the
list and so on until there are no more elements. The final result is returned.
def findLongest(nameA:String, nameB:String):String = {
if (nameA.length > nameB.length) nameA else nameB
}
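For example (assuming a list of names):

val names = List("Ada", "Grace", "Barbara")
names.reduceLeft(findLongest)   // "Barbara"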
There are some differences in how the reduce functions work as compared to the fold functions:
reduce uses the first element of the collection as its initial accumulator value (and so throws an
exception on an empty collection, where fold simply returns its initial value), and the result type of
reduce must be a supertype of the element type, whereas fold takes its initial value explicitly.
Chapter 7: Continuations Library
Introduction
Continuation passing style is a form of control flow that involves passing to functions the rest of the
computation as a "continuation" argument. The function in question later invokes that continuation
to continue program execution. One way to think of a continuation is as a closure. The Scala
continuations library brings delimited continuations in the form of the primitives shift/reset to the
language.
Syntax
• reset { ... } // Continuations extend up to the end of the enclosing reset block
• shift { ... } // Create a continuation stating from after the call, passing it to the closure
• A @cpsParam[B, C] // A computation that requires a function A => B to create a value of C
• @cps[A] // Alias for @cpsParam[A, A]
• @suspendable // Alias for @cpsParam[Unit, Unit]
Remarks
shift and reset are primitive control flow structures, like Int.+ is a primitive operation and Long is a
primitive type. They are more primitive than either in that delimited continuations can actually be
used to construct almost all control flow structures. They are not very useful "out-of-the-box", but
they truly shine when they are used in libraries to create rich APIs.
Continuations and monads are also closely linked. Continuations can be made into the
continuation monad, and monads are continuations because their flatMap operation takes a
continuation as parameter.
Examples
Callbacks are Continuations
The function argument to readFile is a continuation, in that readFile invokes it to continue program
execution after it has done its job.
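A sketch of the callback-style API being described (the signature is an assumption):

import scala.util.Try

def readFile(path: String)(callback: Try[String] => Unit): Unit = ???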
In order to rein in what can easily become callback hell, we use the continuations library.
// After compilation, shift and reset are transformed back into closures
// The for comprehension first desugars to:
reset {
shift(readFile(path1)).flatMap { file1 => shift(readFile(path2)).foreach { file2 =>
processFiles(file1, file2) } }
}
// And then the callbacks are restored via CPS transformation
readFile(path1) { _.flatMap { file1 => // We see how shift moves the code after it into a
closure
readFile(path2) { _.foreach { file2 =>
processFiles(file1, file2)
}}
}} // And we see how reset closes all those closures
// And it looks just like the old version!
If shift is called outside of a delimiting reset block, it can be used to create functions that
themselves create continuations inside a reset block. It is important to note that shift's type is not
just (((A => B) => C) => A), it is actually (((A => B) => C) => (A @cpsParam[B, C])). That annotation
marks where CPS transformations are needed. Functions that call shift without reset have their
return type "infected" with that annotation.
Inside a reset block, a value of A @cpsParam[B, C] seems to have a value of A, though really it's just
pretending. The continuation that is needed to complete the computation has type A => B, so the
code following a method that returns this type must return B. C is the "real" return type, and after
CPS transformation the function call has the type C.
Now, the example, taken from the Scaladoc of the library
Here, ask will store the continuation into a map, and later some other code can retrieve that
"session" and pass in the result of the query to the user. In this way, go can actually be using an
asynchronous library while its code looks like normal imperative code.
Chapter 8: Currying
Syntax
• aFunction(10)_ //Using '_' Tells the compiler that all the parameters in the rest of the
parameter groups will be curried.
• nArityFunction.curried //Converts an n-arity Function to an equivalent curried version
• anotherFunction(x)(_: String)(z) // Currying an arbitrary parameter. It needs its type explicitly
stated.
Examples
A configurable multiplier as a curried function
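The definitions for this example were not preserved; a minimal sketch of what the usages below assume (all names and bodies are reconstructions):

def multiply(factor: Int)(x: Int): Int = factor * x
val multiplyBy3 = multiply(3) _           // fix the first parameter list; `_` curries the rest
multiplyBy3(10)                           // 30

def pickNumberOrLetter(useNumber: Boolean)(number: String)(letter: String): String =
  if (useNumber) number else letter
val switchBetween3AndE = pickNumberOrLetter(_: Boolean)("3")("E")

def minus(a: Int)(b: Int): Int = a - b
val numberMinus5 = minus(_: Int)(5)       // currying an arbitrary parameter needs its type stated
val fiveMinusNumber = minus(5) _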
switchBetween3AndE(true) // "3"
switchBetween3AndE(false) // "E"
numberMinus5(7) // 2
fiveMinusNumber(7) // -2
Currying
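The starting point referred to below (a sketch; the original definition was not shown):

def add(a: Int, b: Int): Int = a + b
val addCurried = (add _).curried   // Int => Int => Int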
Currying add transforms it into a function that takes one Int and returns a function (from one Int
to an Int)
You can apply this concept to any function that takes multiple arguments. Currying a function that
takes multiple arguments, transforms it into a series of applications of functions that take one
argument:
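For example, with a three-argument function (a sketch):

def add3(a: Int, b: Int, c: Int): Int = a + b + c
val add3Curr = (add3 _).curried    // Int => Int => Int => Int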
val x = add3Curr(1)(2)(42)
Currying
is the technique of translating the evaluation of a function that takes multiple arguments
into evaluating a sequence of functions.
Concretely, in terms of scala types, in the context of a function that take two arguments, (has arity
2) it is the conversion of
val f: (A, B) => C // a function that takes two arguments of type `A` and `B` respectively
// and returns a value of type `C`
to
val curriedF: A => B => C // a function that takes an argument of type `A`
// and returns *a function*
// that takes an argument of type `B` and returns a `C`
Usage:
1. You can write curried functions as methods, so curriedF can be written as a method with multiple parameter lists (see the sketch below).
2. You can un-curry (i.e. go from A => B => C to (A, B) => C) using a standard library method:
Function.uncurried
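A minimal sketch of both points (types and names are illustrative):

// 1. a curried method: one parameter per parameter list
def curriedAdd(a: Int)(b: Int): Int = a + b

// 2. un-currying via the standard library
val uncurriedAdd: (Int, Int) => Int = Function.uncurried(curriedAdd _)
uncurriedAdd(1, 2)   // 3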
Currying is the technique of translating the evaluation of a function that takes multiple arguments
into evaluating a sequence of functions, each with a single argument.
Example 1
Let's assume that the total yearly income is a function composed by the income and a bonus:
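A sketch of the definitions this example assumes (names follow the usage below):

val totalYearlyIncome: (Int, Int) => Int = (income, bonus) => income + bonus
val totalYearlyIncomeCurried: Int => Int => Int = totalYearlyIncome.curried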
Note that the curried type in the above definition, Int => Int => Int, can also be viewed/written as
Int => (Int => Int), since => associates to the right.
val partialTotalYearlyIncome: Int => Int = totalYearlyIncomeCurried(10000)
partialTotalYearlyIncome(100)
Example 2
Let's assume that the car manufacturing involves the application of car wheels and car body:
class CarWheelsFactory {
def applyCarWheels(carManufacturing:(String,String) => String): String => String =
carManufacturing.curried("applied wheels..")
}
class CarBodyFactory {
def applyCarBody(partialCarWithWheels: String => String): String =
partialCarWithWheels("applied car body..")
}
Notice that the CarWheelsFactory above curries the car manufacturing function and only applies the
wheels.
The car manufacturing process then will take the below form:
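A sketch of that wiring (the carManufacturing function is an assumption):

val carManufacturing: (String, String) => String = (wheels, body) => wheels + body

val partialCarWithWheels = new CarWheelsFactory().applyCarWheels(carManufacturing)
val car = new CarBodyFactory().applyCarBody(partialCarWithWheels)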
What we have is a list of credit cards and we'd like to calculate the premiums for all those cards
that the credit card company has to pay out. The premiums themselves depend on the total
number of credit cards, so that the company adjust them accordingly.
We already have a function that calculates the premium for a single credit card and takes into
account the total cards the company has issued:
object CreditCard {
def getPremium(totalCards: Int, creditCard: CreditCard): Double = { ... }
}
Now a reasonable approach to this problem would be to map each credit card to a premium and
reduce it to a sum. Something like this:
However the compiler isn't going to like this, because CreditCard.getPremium requires two
parameters. Partial application to the rescue! We can partially apply the total number of credit
cards and use that function to map the credit cards to their premiums. All we need to do is curry
the getPremium function by changing it to use multiple parameter lists and we're good to go.
object CreditCard {
def getPremium(totalCards: Int)(creditCard: CreditCard): Double = { ... }
}
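A sketch of the final step (assuming a creditCards: List[CreditCard] and a totalCards: Int are in scope):

val totalPremium = creditCards.map(CreditCard.getPremium(totalCards) _).sum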
Chapter 9: Dependency Injection
Examples
Cake Pattern with inner implementation class.
class TimeUtilImpl{
def now() = new DateTime()
}
}
class MainControllerImpl {
def printCurrentTime() = println(timeUtil.now()) //timeUtil is injected from TimeUtil
trait
}
}
app.mainController.printCurrentTime()
}
The most important syntax is the self-type annotation (_: TimeUtil =>), which injects TimeUtil into
MainController. In other words, MainController depends on TimeUtil.

I use an inner class (e.g. TimeUtilImpl) in each component because, in my opinion, it is easier for
testing, as we can mock the inner class. It is also easier to trace where a method is called
from as the project grows more complex.

Lastly, I wire all the components together. If you are familiar with Guice, this is equivalent to its Binding.
Chapter 10: Dynamic Invocation
Introduction
Scala allows you to use dynamic invocation when calling methods or accessing fields on an
object. Instead of having this built deep into the language, this is accomplished through rewriting
rules similar to those of implicit conversions, enabled by the marker trait [scala.Dynamic][Dynamic
scaladoc]. This allows you to emulate the ability to dynamically add properties to objects present in
dynamic languages, and more. [Dynamic scaladoc]: http://www.scala-
lang.org/api/2.12.x/scala/Dynamic.html
Syntax
• class Foo extends Dynamic
• foo.field
• foo.field = value
• foo.method(args)
• foo.method(namedArg = x, y)
Remarks
In order to declare subtypes of Dynamic, the language feature dynamics must be enabled, either by
importing scala.language.dynamics or by the -language:dynamics compiler option. Users of this
Dynamic who do not define their own subtypes do not need to enable this.
Examples
Field Accesses
This:
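The rewrites below assume a Dynamic class roughly like the following (a sketch; the original definition was not shown):

import scala.language.dynamics

class Foo extends Dynamic {
  def selectDynamic(name: String): Nothing = ???           // field reads
  def updateDynamic(name: String)(value: Int): Unit = ???  // field writes (Int only)
}
val foo = new Foo

foo.field      // Becomes foo.selectDynamic("field")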
foo.field = 10 // Becomes foo.updateDynamic("field")(10)
foo.field = "10" // Does not compile; "10" is not an Int.
foo.x() // Does not compile; Foo does not define applyDynamic, which is used for methods.
foo.x.apply() // DOES compile, as Nothing is a subtype of () => Any
// Remember, the compiler is still doing static type checks, it just has one more way to
// "recover" and rewrite otherwise invalid code now.
Method Calls
This:
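A sketch of the rewrites for method calls (dyn is assumed to extend Dynamic):

dyn.method(args)             // Becomes dyn.applyDynamic("method")(args)
dyn.method(namedArg = x, y)  // Becomes dyn.applyDynamicNamed("method")(("namedArg", x), ("", y))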
Slightly counterintuitively (but also the only sane way to make it work), this:

dyn.x(y) = z

is equivalent to:
dyn.selectDynamic("x").update(y, z)
while
dyn.x(y)
is still
dyn.applyDynamic("x")(y)
It is important to be aware of this, or else it may sneak by unnoticed and cause strange errors.
Chapter 11: Enumerations
Remarks
Approach with sealed trait and case objects is preferred because Scala enumeration has a few
problems:
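For instance, exhaustiveness of a match over Enumeration values is not checked at compile time; a sketch (WeekDays is defined later in this chapter):

def isWeekendWithBug(day: WeekDays.Value): Boolean = day match {
  case WeekDays.Sat | WeekDays.Sun => true
  // the remaining days are forgotten, yet this compiles without any warning
}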
isWeekendWithBug(WeekDays.Fri)
scala.MatchError: Fri (of class scala.Enumeration$Val)
Compare with:
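A sketch of the sealed-trait equivalent, where the compiler does warn about the missing cases:

def isWeekendWithBug(day: WeekDay): Boolean = day match {
  case WeekDay.Sat | WeekDay.Sun => true
}
// warning: match may not be exhaustive.
// It would fail on the following inputs: Fri, Mon, Thu, Tue, Wed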
Examples
Days of the week using Scala Enumeration
isWeekend(WeekDays.Sun)
res0: Boolean = true
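The definition this call refers to (a sketch; WeekDays itself is defined just below):

def isWeekend(day: WeekDays.Value): Boolean = day match {
  case WeekDays.Sat | WeekDays.Sun => true
  case _ => false
}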
object WeekDays extends Enumeration {
val Mon = Value("Monday")
val Tue = Value("Tuesday")
val Wed = Value("Wednesday")
val Thu = Value("Thursday")
val Fri = Value("Friday")
val Sat = Value("Saturday")
val Sun = Value("Sunday")
}
println(WeekDays.Mon)
>> Monday
WeekDays.withName("Monday") == WeekDays.Mon
>> res0: Boolean = true
Beware of the not-so-typesafe behavior, wherein different enumerations can evaluate as the same
instance type:
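Assuming, for example, a second unrelated enumeration:

object Parity extends Enumeration {
  val Even, Odd = Value
}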
WeekDays.Mon.isInstanceOf[Parity.Value]
>> res1: Boolean = true
sealed trait WeekDay

object WeekDay {
case object Mon extends WeekDay
case object Tue extends WeekDay
case object Wed extends WeekDay
case object Thu extends WeekDay
case object Fri extends WeekDay
case object Sun extends WeekDay
case object Sat extends WeekDay
}
The sealed keyword guarantees that the trait WeekDay cannot be extended in another file. This
allows the compiler to make certain assumptions, including that all possible values of WeekDay are
already enumerated.
One drawback is that this method does not allow you to obtain a list of all possible values. To get
such a list it must be provided explicitly:
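For example (a sketch; the list must be maintained by hand):

val allWeekDays: Seq[WeekDay] =
  Seq(WeekDay.Mon, WeekDay.Tue, WeekDay.Wed, WeekDay.Thu, WeekDay.Fri, WeekDay.Sat, WeekDay.Sun)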
Case classes can also extend a sealed trait. Thus, objects and case classes can be mixed to
create complex hierarchies:
sealed trait CelestialBody
object CelestialBody {
case object Earth extends CelestialBody
case object Sun extends CelestialBody
case object Moon extends CelestialBody
case class Asteroid(name: String) extends CelestialBody
}
Another drawback is that there is no way to access the variable name of a sealed object's
enumeration, or to search by it. If you need some kind of name associated with each value, it must be
manually defined:
object WeekDay {
case object Mon extends WeekDay { val name = "Monday" }
case object Tue extends WeekDay { val name = "Tuesday" }
(...)
}
Or just (this variant requires turning the trait into a class that carries the name, e.g. sealed abstract
class WeekDay(val name: String)):

object WeekDay {
object Mon extends WeekDay("Monday")
object Tue extends WeekDay("Tuesday")
(...)
}
This is just an extension on the sealed trait variant where a macro generates a set with all
instances at compile time. This nicely omits the drawback that a developer can add a value to the
enumeration but forget to add it to the allElements set.
import EnumerationMacros._
import scala.collection.immutable.TreeSet
import scala.language.experimental.macros
import scala.reflect.macros.blackbox
/**
A macro to produce a TreeSet of all instances of a sealed trait.
Based on Travis Brown's work:
http://stackoverflow.com/questions/13671734/iteration-over-a-sealed-trait-in-scala
CAREFUL: !!! MUST be used at END OF code block containing the instances !!!
*/
object EnumerationMacros {
def sealedInstancesOf[A]: TreeSet[A] = macro sealedInstancesOf_impl[A]
if (!symbol.isClass || !symbol.isSealed)
c.abort(c.enclosingPosition, "Can only enumerate values of a sealed trait or class.")
else {
Apply(
Select(
reify(TreeSet).tree,
TermName("apply")
),
children.map(sourceModuleRef(_))
)
}
}
}
}
Chapter 12: Error Handling
Examples
Try
import scala.util.Try
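A minimal sketch of Try in action (the original example was not preserved):

def parseInt(value: String): Try[Int] = Try(value.toInt)

parseInt("123")   // Success(123)
parseInt("abc")   // Failure(java.lang.NumberFormatException: For input string: "abc")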
Either
def getPersonFromWebService(url: String): Either[String, Person] = {  // sketch: wrapper assumed
  val response = webServiceClient.get(url)
  response.webService.status match {
    case 200 =>
      val person = parsePerson(response)
      if (!isValid(person)) Left("Validation failed")
      else Right(person)
    case _ => Left(s"Request failed with status ${response.webService.status}")
  }
}
getPersonFromWebService("http://some-webservice.com/person") match {
case Left(errorMessage) => println(errorMessage)
case Right(person) => println(person.surname)
}
Option
The use of null values is strongly discouraged, unless interacting with legacy Java code that
expects null. Instead, Option should be used when the result of a function might either be
something (Some) or nothing (None).
A try-catch block is more appropriate for error-handling, but if the function might legitimately return
nothing, Option is appropriate to use, and simple.
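A sketch of the function discussed below (the signature is an assumption):

def findPerson(name: String): Option[Person] = ???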
If no person is found, None can be returned. Otherwise, an object of type Some containing a Person
object is returned. What follows are ways to handle an object of type Option.
Pattern Matching
findPerson(personName) match {
case Some(person) => println(person.surname)
case None => println(s"No person found with name $personName")
}
Using fold
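For example:

findPerson(personName).fold(println(s"No person found with name $personName")) { person =>
  println(person.surname)
}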
Converting to Java
If you need to convert an Option type to a null-able Java type for interoperability:
val s: Option[String] = Option("hello")
s.orNull // "hello": String
s.getOrElse(null) // "hello": String
When an exception is thrown from within a Future, you can (should) use recover to handle it.
For instance,
...will throw an Exception from within the Future. But since we can predict with high probability that an
Exception of type FairlyStupidException will be thrown, we can specifically handle this case in an
elegant way:
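A sketch of what that can look like (FairlyStupidException, runFuture and BadRequest are assumed names; BadRequest is defined here only as a stand-in for an HTTP-style result):

import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

case class BadRequest(message: String)
class FairlyStupidException(msg: String = "") extends RuntimeException(msg)

def runFuture: Future[BadRequest] = Future { throw new FairlyStupidException }

val handled: Future[BadRequest] = runFuture.recover {
  case e: FairlyStupidException => BadRequest("Another stupid exception!")
}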
As you can see the method given to recover is a PartialFunction over the domain of all Throwable,
so you can handle just a certain few types and then let the rest go into the ether of exception
handling at higher levels in the Future stack.
Note that this is similar to running the following code in a non-Future context:
try {
runNotFuture
} catch {
case e: FairlyStupidException => BadRequest("Another stupid exception!")
}
It is really important to handle exceptions generated within Futures because much of the time they
are more insidious. They don't get all in your face usually, because they run in a different
execution context and thread, and thus do not prompt you to fix them when they happen,
especially if you don't notice anything in logs or the behavior of the application.
In addition to functional constructs such as Try, Option and Either for error handling, Scala also
supports a syntax similar to Java's, using a try-catch clause (with a potential finally block as well).
The catch clause is a pattern match:
try {
// ... might throw exception
} catch {
case ioe: IOException => ... // more specific cases first
case e: Exception => ...
// uncaught types will be thrown
} finally {
// ...
}
To convert exceptions into Either or Option types, you can use the methods provided in
scala.util.control.Exception:
import scala.util.control.Exception._
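For example (a minimal sketch):

val asOption: Option[Int] = allCatch.opt("42a".toInt)              // None
val asEither: Either[Throwable, Int] = allCatch.either("42".toInt) // Right(42)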
Chapter 13: Extractors
Syntax
• val extractor(extractedValue1, _ /* ignored second extracted value */) = valueToBeExtracted
• valueToBeExtracted match { case extractor(extractedValue1, _) => ???}
• val (tuple1, tuple2, tuple3) = tupleWith3Elements
• object Foo { def unapply(foo: Foo): Option[String] = Some(foo.x); }
Examples
Tuple Extractors
To unpack an extractor:
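For example, a tuple is unpacked like this:

val t = ("hello", 3.14, 42)
val (s, d, i) = t   // s = "hello", d = 3.14, i = 42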
Note that tuples have a maximum length of 22, and thus ._1 through ._22 will work (assuming the
tuple is at least that size).
Tuple extractors may be used to provide symbolic arguments for literal functions:
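Assuming, for example:

val persons = Seq(("Ada", "Lovelace"), ("Grace", "Hopper"))
val names = Seq("Lovelace, Ada", "Hopper, Grace")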
assert {
names ==
(persons map { name =>
s"${name._2}, ${name._1}"
})
}
assert {
names ==
(persons map { case (given, surname) =>
s"$surname, $given"
})
}
A case class is a class with a lot of standard boilerplate code automatically included. One benefit
of this is that Scala makes it easy to use extractors with case classes.
case class Person(name: String, age: Int) // Define the case class
val p = Person("Paola", 42) // Instantiate a value with the case class type
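The values are then extracted with the case class's generated extractor:

val Person(n, a) = p   // n: String = Paola, a: Int = 42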
At this juncture, both n and a are vals in the program and can be accessed as such: they are said
to have been 'extracted' from p. Continuing:
• Extraction can happen at 'deep' levels: properties of nested objects can be extracted.
• Not all elements need to be extracted. The wildcard _ character indicates that that particular
value can be anything, and is ignored. No val is created.
Here, we have code that uses the extractor to explicitly check that person is a Person object and
immediately pull out the variables that we care about: n and a.
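For example (a minimal sketch):

def describe(x: Any): String = x match {
  case Person(n, a) => s"$n is $a years old"
  case _            => "Not a Person"
}

describe(p)   // "Paola is 42 years old"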
A custom extraction can be written by implementing the unapply method and returning a value of
type Option:
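Assuming, for example, a plain class:

class Foo(val x: String)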
object Foo {
def unapply(foo: Foo): Option[String] = Some(foo.x)
}
new Foo("42") match {
case Foo(x) => x
}
// "42"
The return type of unapply may be something other than Option, provided the type returned
provides get and isEmpty methods. In this example, Bar is defined with those methods, and unapply
returns an instance of Bar:
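A sketch of such a Bar (the returned type only needs isEmpty and get):

class Bar(val x: String) {
  def isEmpty: Boolean = false
  def get: String = x
}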
object Bar {
def unapply(bar: Bar): Bar = bar
}
The return type of unapply can also be a Boolean, which is a special case that does not carry the get
and isEmpty requirements above. However, note in this example that DivisibleByTwo is an object,
not a class, and does not take a parameter (and therefore that parameter cannot be bound):
object DivisibleByTwo {
def unapply(num: Int): Boolean = num % 2 == 0
}
4 match {
case DivisibleByTwo() => "yes"
case _ => "no"
}
// yes
3 match {
case DivisibleByTwo() => "yes"
case _ => "no"
}
// no
Remember that unapply goes in the companion object of a class, not in the class. The example
above will be clear if you understand this distinction.
If a case class has exactly two values, its extractor can be used in infix notation.
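For example (a sketch; the original snippet was not preserved):

case class Pair(a: String, b: String)
val x Pair y = Pair("hello", "world")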
//x: String = hello
//y: String = world
object Foo {
def unapply(s: String): Option[(Int, Int)] = Some((s.length, 5))
}
val a Foo b = "hello world!"
//a: Int = 12
//b: Int = 5
Regex Extractors
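A regular expression with groups can be used as an extractor; a minimal sketch:

val Date = """(\d{4})-(\d{2})-(\d{2})""".r

"2017-04-19" match {
  case Date(year, month, day) => s"year $year, month $month, day $day"
  case _ => "not a date"
}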
Transformative extractors
Extractor behavior can be used to derive arbitrary values from their input. This can be useful in
scenarios where you want to be able to act on the results of a transformation in the event that the
transformation is successful.
Consider as an example the various user name formats usable in a Windows environment:
object UserPrincipalName {
def unapply(str: String): Option[(String, String)] = str.split('@') match {
case Array(u, d) if u.length > 0 && d.length > 0 => Some((u, d))
case _ => None
}
}
object DownLevelLogonName {
def unapply(str: String): Option[(String, String)] = str.split('\\') match {
case Array(d, u) if u.length > 0 && d.length > 0 => Some((d, u))
case _ => None
}
}
case _ => None
}
In fact it is possible to create an extractor exhibiting both behaviors by broadening the types it can
match:
object UserPrincipalName {
def unapply(obj: Any): Option[(String, String)] = obj match {
case upn: UserPrincipalName => Some((upn.username, upn.domain))
case str: String => str.split('@') match {
case Array(u, d) if u.length > 0 && d.length > 0 => Some((u, d))
case _ => None
}
case _ => None
}
}
In general, extractors are simply a convenient reformulation of the Option pattern, as applied to
methods with names like tryParse:
UserPrincipalName.unapply("user@domain") match {
case Some((u, d)) => ???
case None => ???
}
Chapter 14: For Expressions
Syntax
• for {clauses} body
• for {clauses} yield body
• for (clauses) body
• for (clauses) yield body
Parameters
Parameter   Details

clauses     The iteration and filters over which the for works.

yield       Use this if you want to create or 'yield' a collection. Using yield will cause the
            return type of the for to be a collection instead of Unit.
Examples
Basic For Loop
This demonstrates iterating a variable, x, from 1 to 10 and doing something with that value. The
return type of this for comprehension is Unit.
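For example:

for (x <- 1 to 10)
  println(x)   // 'do something' with x; the expression as a whole returns Unit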
This demonstrates a filter on a for-loop, and the use of yield to create a 'sequence
comprehension':
for ( x <- 1 to 10 if x % 2 == 0)
yield x
A for comprehension is useful when you need to create a new collection based on the iteration
and its filters.
for {
x <- 1 to 2
y <- 'a' to 'd'
} println("(" + x + "," + y + ")")
(Note that to here is an infix operator method that returns an inclusive range. See the definition
here.)
(1,a)
(1,b)
(1,c)
(1,d)
(2,a)
(2,b)
(2,c)
(2,d)
The same loop can be written with parentheses, in which case a semicolon is needed to separate the
generators:

for (
  x <- 1 to 2;
  y <- 'a' to 'd'
) println("(" + x + "," + y + ")")
In order to get all of the combinations into a single vector, we can yield the result and set it to a
val:
val a = for {
x <- 1 to 2
y <- 'a' to 'd'
} yield "(%s,%s)".format(x, y)
// a: scala.collection.immutable.IndexedSeq[String] = Vector((1,a), (1,b), (1,c), (1,d),
(2,a), (2,b), (2,c), (2,d))
If you have several objects of monadic types, we can achieve combinations of the values using a
'for comprehension':
for {
x <- Option(1)
y <- Option("b")
z <- List(3, 4)
} {
// Now we can use the x, y, z variables
println(x, y, z)
x // the last expression is *not* the output of the block in this case!
}
// This prints
// (1, "b", 3)
// (1, "b", 4)
If the objects are of the same monadic type M (e.g. Option) then using yield will return an object of
type M instead of Unit.
val a = for {
x <- Option(1)
y <- Option("b")
} yield {
// Now we can use the x, y variables
println(x, y)
// whatever is at the end of the block is the output
(7 * x, y)
}
// This prints:
// (1, "b")
// The val `a` is set:
// a: Option[(Int, String)] = Some((7,b))
Note that the yield keyword cannot be used in the original example, where there is a mix of
monadic types (Option and List). Trying to do so will yield a compile-time type mismatch error.
for comprehensions in Scala are just syntactic sugar. These comprehensions are implemented
using the withFilter, foreach, flatMap and map methods of their subject types. For this reason, only
types that have these methods defined can be utilized in a for comprehension.
A for comprehension of the following form, with patterns pN, generators gN and conditions cN:
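For example (three levels shown; the pattern generalizes):

for {
  p0 <- g0 if c0
  p1 <- g1 if c1
  p2 <- g2 if c2
} yield body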
... will de-sugar to nested calls using withFilter and either flatMap or map:
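Roughly (a sketch of the translation):

g0.withFilter { case p0 => c0 }.flatMap { case p0 =>
  g1.withFilter { case p1 => c1 }.flatMap { case p1 =>
    g2.withFilter { case p2 => c2 }.map { case p2 =>
      body
    }
  }
}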
(Note that map is used in the innermost comprehension, and flatMap is used in every outer
comprehension.)
A for comprehension can be applied to any type implementing the methods required by the de-
sugared representation. There are no restrictions on the return types of these methods, so long as
they are composable.
Chapter 15: Functions
Remarks
Scala has first-class functions.
• Functions are compiled to a class extending a trait (such as Function1) at compile-time, and
are instantiated to a value at runtime. Methods, on the other hand, are members of their
class, trait or object, and do not exist outside of that.
• A method may be converted to a function, but a function cannot be converted to a method.
• Methods can have type parameterization, whereas functions do not.
• Methods can have parameter default values, whereas functions can not.
Examples
Anonymous Functions
Anonymous functions are functions that are defined but not assigned a name.
The following is an anonymous function that takes in two integers and returns the sum.
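(x: Int, y: Int) => x + y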
Anonymous functions are primarily used as arguments to other functions. For instance, the map
function on a collection expects another function as its argument:
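Seq("Foo", "Bar", "Qux").map((x: String) => x.toUpperCase)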
The types of the arguments of the anonymous function can be omitted: the types are inferred
automatically:
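The same call with the parameter type left out:

Seq(1, 2, 3).map(x => x * 2)   // Seq(2, 4, 6)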
If there is just one argument, the parentheses around that argument can be omitted:
Seq("Foo", "Bar", "Qux").map(x => x.toUpperCase)
Underscores shorthand
There is an even shorter syntax that doesn't require names for the arguments. The above snippet
can be written:
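A sketch of that shorter form:

Seq("Foo", "Bar", "Qux").map(_.toUpperCase)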
_ represents the anonymous function arguments positionally. With an anonymous function that has
multiple parameters, each occurrence of _ will refer to a different argument. For instance, the two
following expressions are equivalent:
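For example (a sketch):

Seq(1, 2, 3).reduce((a, b) => a + b)
Seq(1, 2, 3).reduce(_ + _)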
When using this shorthand, any argument represented by the positional _ can only be referenced
a single time and in the same order.
Composition
Function composition allows for two functions to operate and be viewed as a single function.
Expressed in mathematical terms, given a function f(x) and a function g(x), the function h(x) =
f(g(x)).
When a function is compiled, it is compiled to a type related to Function1. Scala provides two
methods in the Function1 implementation related to composition: andThen and compose. The compose
method fits with the above mathematical definition like so:
val f: A => B = ???
val g: B => C = ???
val h = g compose f    // h: A => C, h(x) == g(f(x))
val h2 = f andThen g   // equivalent: h2(x) == g(f(x))
A new anonymous function is allocated that is closed over f and g; in both cases, that new
function is what gets bound to the resulting value (h or h2).
If either f or g works via a side effect, then calling the composed function will cause all
side effects of f and g to happen in that order. The same is true of any mutable state changes.
Relationship to PartialFunctions
To define a partial function (which is also a function), use the following syntax:
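A minimal sketch of the case-based syntax:

val justOdds: PartialFunction[Int, String] = {
  case x if x % 2 == 1 => s"$x is odd"
}

justOdds(3)               // "3 is odd"
justOdds.isDefinedAt(2)   // false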
Chapter 16: Futures
Examples
Creating a Future
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
object FutureDivider {
def divide(a: Int, b: Int): Future[Int] = Future {
// Note that this is integer division.
a / b
}
}
Quite simply, the divide method creates a Future that will resolve with the quotient of a over b.
The easiest way to consume a successful Future (or rather, to get the value inside the Future) is
to use the map method. Suppose some code calls the divide method of the FutureDivider object from
the "Creating a Future" example. What would the code need to look like to get the quotient of a
over b?
object Calculator {
def calculateAndReport(a: Int, b: Int) = {
val eventualQuotient = FutureDivider divide(a, b)
eventualQuotient map {
quotient => println(quotient)
}
}
}
Sometimes the computation in a Future can create an exception, which will cause the Future to
fail. In the "Creating a Future" example, what if the calling code passed 55 and 0 to the divide
method? It'd throw an ArithmeticException after trying to divide by zero, of course. How would that
be handled in consuming code? There are actually a handful of ways to deal with failures.
object Calculator {
def calculateAndReport(a: Int, b: Int) = {
val eventualQuotient = FutureDivider divide(a, b)
eventualQuotient recover {
case ex: ArithmeticException => println(s"It failed with: ${ex.getMessage}")
}
}
}
Handle the exception with the failed projection, where the exception becomes the value of the
Future:
object Calculator {
def calculateAndReport(a: Int, b: Int) = {
val eventualQuotient = FutureDivider divide(a, b)
// Note the use of the dot operator to get the failed projection and map it.
eventualQuotient.failed.map {
ex => println(s"It failed with: ${ex.getMessage}")
}
}
}
The previous examples demonstrated the individual features of a Future, handling success and
failure cases. Usually, however, both features are handled much more tersely. Here's the
example, written in a neater and more realistic way:
object Calculator {
def calculateAndReport(a: Int, b: Int) = {
val eventualQuotient = FutureDivider divide(a, b)
eventualQuotient map {
quotient => println(s"Quotient: $quotient")
} recover {
case ex: ArithmeticException => println(s"It failed with: ${ex.getMessage}")
}
}
}
In general, sequence is a well-known operator in functional programming that transforms F[G[T]]
into G[F[T]], subject to constraints on F and G.
There is an alternative operator called traverse, which works similarly but takes a function as an
extra argument. With the identity function x => x as that argument, it behaves like the sequence operator.
def listOfFuture: List[Future[Int]] = List(1,2,3).map(Future(_))
def futureOfList: Future[List[Int]] = Future.traverse(listOfFuture)(x => x)
However, the extra argument allows you to modify each Future instance inside the given listOfFuture.
Furthermore, the first argument doesn't need to be a list of Futures. Therefore it is possible to
transform the example as follows:
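A sketch of that transformed call (the name futureOfList2 is illustrative):

def futureOfList2: Future[List[Int]] = Future.traverse(List(1, 2, 3))(x => Future(x))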
In this case the List(1,2,3) is directly passed as first argument and the identity function x => x is
replaced with the function Future(_) to similarly wrap each Int value into a Future. An advantage of
this is that the intermediary List[Future[Int]] can be omitted to improve performance.
The for comprehension is a compact way to run a block of code that depends on the successful
result of multiple futures.
With f1, f2, f3 three Future[String]'s that will contain the strings one, two, three respectively,
val fCombined =
for {
s1 <- f1
s2 <- f2
s3 <- f3
} yield (s"$s1 - $s2 - $s3")
Also keep in mind that a for comprehension is just syntactic sugar for flatMap calls, so
constructing the Future objects inside the for body removes the concurrent execution of the
enclosed code blocks and leads to sequential code. This is illustrated in the following example:
// Sequential: the second Future is constructed only after the first one completes
val result1 = for {
  first  <- Future { Thread.sleep(1000); System.currentTimeMillis() }
  second <- Future { Thread.sleep(1000); System.currentTimeMillis() }
} yield first - second

// Concurrent: both Futures are already running before the for comprehension starts
val fut1 = Future { Thread.sleep(1000); System.currentTimeMillis() }
val fut2 = Future { Thread.sleep(1000); System.currentTimeMillis() }
val result2 = for {
  first <- fut1
  second <- fut2
} yield first - second
The value inside result1 will always be negative (its first timestamp is taken roughly a second before the second one), while the value inside result2 will be close to zero, because both futures run concurrently.
For more details about the for comprehension and yield in general, see http://docs.scala-
lang.org/tutorials/FAQ/yield.html
Chapter 17: Handling units (measures)
Syntax
• class Meter(val meters: Double) extends AnyVal
• type Meter = Double
Remarks
It is recommended to use value classes for units or a dedicated library for them.
Examples
Type aliases
This simple approach has serious drawbacks for unit handling as every other type that is a Double
will be compatible with it:
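A sketch of the problem (the Gram alias is an illustrative addition):

type Meter = Double
type Gram  = Double

val length: Meter = 3.0
val weight: Gram  = length     // compiles: both are just Double
val nonsense = length + weight // compiles as well, silently mixing units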
All of the above compiles, so in this case the unit types only document intent for readers of the
code; the compiler does not enforce them.
Value classes
Value classes provide a type-safe way to encode units, even if they require a bit more characters
to use them:
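A sketch using the value class from the Syntax section (the + method and the Gram class are illustrative additions):

class Meter(val meters: Double) extends AnyVal {
  def +(other: Meter): Meter = new Meter(meters + other.meters)
}
class Gram(val grams: Double) extends AnyVal

val length = new Meter(3.0) + new Meter(4.5)   // a Meter holding 7.5
// val wrong = new Meter(3.0) + new Gram(1.0)  // does not compile: type mismatch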
By extending AnyVal, there is no runtime penalty for using them; at the JVM level these are
regular primitive types (Doubles in this case).
In case you want to automatically generate other units (like Velocity aka MeterPerSecond), this
approach is not the best, though there are libraries that can be used in those cases too:
• Squants
• units
• ScalaQuantity
Chapter 18: Higher Order Function
Remarks
Scala goes to great lengths to treat methods and functions as syntactically identical. But under the
hood, they are distinct concepts.
A function is an actual object instance of type Function1 (or a similar type of another arity). Its code
is contained in its apply method. Effectively, it simply acts as a value that can be passed around.
Incidentally, the ability to treat functions as values is exactly what is meant by a language having
support for higher-order functions. Function instances are Scala's approach to implementing this
feature.
An actual higher-order function is a function that either takes a function value as an argument or
returns a function value. But in Scala, as all operations are methods, it's more general to think of
methods that receive or return function parameters. So map, as defined on Seq might be thought of
as a "higher-order function" due to its parameter being a function, but it is not literally a function; it
is a method.
Examples
Using methods as function values
The Scala compiler will automatically convert methods into function values for the purpose of
passing them into higher-order functions.
object MyObject {
  def mapMethod(input: Int): String = {
    input.toString
  }
}

Seq(1, 2, 3).map(MyObject.mapMethod)   // Seq("1", "2", "3")
In the example above, MyObject.mapMethod is not a function call, but instead is passed to map as a
value. Indeed, map requires a function value to be passed to it, as can be seen in its signature. The
signature for the map of a List[A] (a list of objects of type A) is:
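In simplified form, that signature looks like this (in older Scala versions the full signature also takes an implicit CanBuildFrom parameter):

def map[B](f: (A) => B): List[B]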
The f: (A) => B part indicates that the parameter to this method call is some function that takes an
object of type A and returns an object of type B. A and B are arbitrarily defined. Returning to the first
example, we can see that mapMethod takes an Int (which corresponds to A) and returns a String
(which corresponds to B). Thus mapMethod is a valid function value to pass to map. We could rewrite
the same code like this:
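A sketch of the inlined version:

Seq(1, 2, 3).map(input => input.toString)   // Seq("1", "2", "3")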
This inlines the function value, which may add clarity for simple functions.
A higher-order function, as opposed to a first-order function, can have one of three forms:
• One or more of its parameters is a function, and it returns some value.
• It returns a function, but none of its parameters is a function.
• Both of the above: One or more of its parameters is a function, and it returns a function.
object HOF {
  // assumption: a simple helper that joins first and last name
  def getFullName(firstName: String, lastName: String): String = firstName + " " + lastName

  def main(args: Array[String]): Unit = {
    val list =
      List(("Srini","E"),("Subash","R"),("Ranjith","RK"),("Vicky","s"),("Sudhar","s"))
    //HOF
    val fullNameList = list.map(n => getFullName(n._1, n._2))
  }
}
Here the map function takes a getFullName(n._1,n._2) function as a parameter. This is called HOF
(Higher order function).
Scala supports lazy evaluation of function arguments using by-name parameters: def func(arg: => String).
Such an argument can be given either a plain String value or an expression producing a String; in
either case, the expression is evaluated only when (and each time) the value is accessed.
def smartMediator(preconditions: Boolean, data: => String): Option[String] = {
print("Applying mediator")
preconditions match {
case true => Some(data)
case false => None
}
}
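The prose below contrasts this with a dumbMediator that takes its argument strictly; a sketch of that counterpart and of the calls (the argument block is illustrative):

def dumbMediator(preconditions: Boolean, data: String): Option[String] = {
  print("Applying mediator")
  preconditions match {
    case true => Some(data)
    case false => None
  }
}

smartMediator(preconditions = false, data = { println("Calculating expensive data!"); "data" })
dumbMediator(preconditions = false, data = { println("Calculating expensive data!"); "data" })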
A call to smartMediator with preconditions = false returns None and prints only "Applying mediator".
A call to dumbMediator with the same arguments also returns None, but prints "Calculating expensive
data!" followed by "Applying mediator".
Lazy evaluation can be extremely useful when you want to avoid the overhead of computing expensive
arguments that may never be needed.
Chapter 19: If Expressions
Examples
Basic If Expressions
In Scala (in contrast to Java and most other languages), if is an expression instead of a
statement. Regardless, the syntax is identical:
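A minimal sketch (the value x is assumed and reused in the examples below):

val x = 5

if (x > 0) println("Greater than 0") else println("Less than or equal to 0")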
The implication of if being an expression is that you can assign the result of evaluating the
expression to a variable:
val result = if(x > 0) "Greater than 0" else "Less than or equals 0"
// result: String = Greater than 0
Above we see that the if expression is evaluated and result is set to that resulting value.
The return type of an if expression is the supertype of all logic branches. This means that for this
example the return type is a String. Since not all if expressions return a value (such as an if
statement that has no else branch logic), it is possible that the return type is Any:
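For example, leaving out the else branch makes the missing branch evaluate to () (Unit), so the common supertype becomes Any (a sketch):

val result = if (x > 0) "Greater than 0"
// result: Any = Greater than 0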
If no value can be returned (such as if only side effects like println are used inside the logical
branches), then the return type will be Unit:
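A sketch:

val result = if (x > 0) println("Greater than 0") else println("Less than or equal to 0")
// result: Unit = ()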
if expressions in Scala are similar to how the ternary operator in Java functions. Because of this
similarity, there is no such operator in Scala: it would be redundant.
Chapter 20: Implicits
Syntax
• implicit val x: T = ???
Remarks
Implicit classes allow custom methods to be added to existing types, without having to modify their
code, thereby enriching types without needing control of the code.
Using implicit types to enrich an existing class is often referred to as an 'enrich my library' pattern.
1. Implicit classes may only exist within another class, object, or trait.
2. Implicit classes may only have one non-implicit primary constructor parameter.
3. There may not be another object, class, trait, or class member definition within the same
scope that has the same name as the implicit class.
Examples
Implicit Conversion
An implicit conversion allows the compiler to automatically convert an object of one type to another
type. This allows the code to treat an object as an object of another type.
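A sketch of such a conversion (the Foo wrapper and the conversion method name are illustrative; the values Foo(42) and 42 below are the ones the following paragraph refers to):

import scala.language.implicitConversions

case class Foo(value: Int)

implicit def fooToInt(foo: Foo): Int = foo.value

val i: Int = Foo(42)   // the compiler rewrites this to fooToInt(Foo(42)), so i == 42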
The conversion is one-way: in this case you cannot convert 42 back to Foo(42). To do so, a second
implicit conversion must be defined:
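A sketch of that reverse conversion:

implicit def intToFoo(i: Int): Foo = Foo(i)

val foo: Foo = 42      // now the Int-to-Foo direction compiles as well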
Note that this is the mechanism by which a float value can be added to an integer value, for
instance.
Be careful with implicit conversions: the compiler applies them silently, without any visible
indication that a conversion is happening. It is a best practice to use an explicit conversion
via a method call unless there's a tangible readability gain from using an implicit conversion.
Implicit Parameters
Implicit parameters can be useful if a parameter of a type should be defined once in the scope and
then applied to all functions that use a value of that type.
import scala.concurrent.duration._

// a normal method:
def doLongRunningTask(timeout: FiniteDuration): Long = timeout.toMillis

// to call it, a timeout has to be passed explicitly:
doLongRunningTask(1000.millis) // 1000
Now let's say we have some methods that all have a timeout duration, and we want to call all those
methods using the same timeout. We can define timeout as an implicit value.
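A sketch (the method bodies are illustrative; they simply return the timeout in milliseconds):

implicit val timeout: FiniteDuration = 1000.millis

def doLongRunningTaskA()(implicit timeout: FiniteDuration): Long = timeout.toMillis
def doLongRunningTaskB()(implicit timeout: FiniteDuration): Long = timeout.toMillis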
// we can now call the functions without passing the timeout parameter
doLongRunningTaskA() // 1000
doLongRunningTaskB() // 1000
The way this works is that the scalac compiler looks for a value in scope which is marked as
implicit and whose type matches that of the implicit parameter. If it finds one, it applies it
as the implicit argument.
Note that this won't work if you define two or even more implicits of the same type in
the scope.
To customize the error message, use the implicitNotFound annotation on the type:
@annotation.implicitNotFound(msg = "Select the proper implicit value for type M[${A}]!")
case class M[A](v: A) {}

// a method requiring an implicit M[A] (assumed; this is what `usage[Int]` below refers to)
def usage[A](implicit m: M[A]): M[A] = m

//Does not work because no implicit value is present for type `M[Int]`
//usage[Int] //Select the proper implicit value for type M[Int]!
implicit val first: M[Int] = M(1)
usage[Int] //Works when `second` is not in scope
implicit val second: M[Int] = M(2)
//Does not work because more than one implicit value is present for the type `M[Int]`
//usage[Int] //Select the proper implicit value for type M[Int]!
A timeout is a common use case for this; another example is Akka, where the ActorSystem is (most
of the time) the same everywhere, so it is usually passed implicitly. Another use case is library
design, most commonly with FP libraries that rely on typeclasses (such as scalaz, cats or rapture).
It's generally considered bad practice to use implicit parameters with basic types like
Int, Long, String etc. since it will create confusion and make the code less readable.
Implicit Classes
Implicit classes make it possible to add new methods to previously defined classes.
The String class has no method withoutVowels. This can be added like so:
object StringUtil {
implicit class StringEnhancer(str: String) {
def withoutVowels: String = str.replaceAll("[aeiou]", "")
}
}
The implicit class has a single constructor parameter (str) with the type that you would like to
extend (String) and contains the method you would like to "add" to the type (withoutVowels). The
newly defined methods can now be used directly on the enhanced type (when the implicit class
is in scope):
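A sketch of the usage:

import StringUtil.StringEnhancer   // bring the implicit class into scope

"Hello world".withoutVowels        // "Hll wrld"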
Under the hood, implicit classes define an implicit conversion from the enhanced type to the
implicit class, like this:
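A roughly equivalent hand-written version (a sketch; the conversion method name is illustrative):

class StringEnhancer(str: String) {
  def withoutVowels: String = str.replaceAll("[aeiou]", "")
}

implicit def toStringEnhancer(str: String): StringEnhancer = new StringEnhancer(str)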
Implicit classes are often defined as Value classes to avoid creating runtime objects and thus
removing the runtime overhead:
// still inside an enclosing object such as StringUtil
implicit class StringEnhancer(val str: String) extends AnyVal {
  def withoutVowels: String = str.replaceAll("[aeiou]", "")
}
With the above improved definition, a new instance of StringEnhancer doesn't need to be created
every time the withoutVowels method gets invoked.
Assuming an implicit parameter list with more than one implicit parameter:
Now, assuming that one of the implicit instances is not available (SomeCtx1) while all other implicit
instances needed are in-scope, to create an instance of the class an instance of SomeCtx1 must be
provided.
This can be done while preserving each other in-scope implicit instance using the implicitly
keyword:
To see the implicit values available in scope within a REPL session:
scala> :implicits
scala> :implicits -v
If one has an expression and wishes to view the effect of all rewrite rules that apply to it
(including implicits), reify (from scala.reflect.runtime.universe) can be used in the REPL to
print the expression's expanded tree.
Chapter 21: Java Interoperability
Examples
Converting Scala Collections to Java Collections and vice versa
import scala.collection.JavaConverters._
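For example, to pass a Scala collection to Java code (a sketch):

val scalaList = List(1, 2, 3)
val javaList: java.util.List[Int] = scalaList.asJava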
If the Java code returns a Java collection, you can turn it into a Scala collection in a similar
manner:
import scala.collection.JavaConverters._
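A sketch of the other direction:

val javaList: java.util.List[Int] = java.util.Arrays.asList(1, 2, 3)
val scalaBuffer: scala.collection.mutable.Buffer[Int] = javaList.asScala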
Note that these are decorators, so they merely wrap the underlying collections in a Scala or Java
collection interface. Therefore, the calls .asJava and .asScala do not copy the collections.
Arrays
Arrays are regular JVM arrays with a twist that they are treated as invariant and have special
constructors and implicit conversions. Construct them without the new keyword.
val a = Array("element")
You can use an Array like other collections, thanks to an implicit conversion to TraversableLike
ArrayOps:
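For instance (a sketch):

Array(1, 2, 3).map(_ * 2)   // Array(2, 4, 6), using the map method provided by ArrayOps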
Most of the Scala collections (TraversableOnce) have a toArray method taking an implicit ClassTag to
construct the result array:
List(0).toArray
//> res1: Array[Int] = Array(0)
This makes it easy to use any TraversableOnce in your Scala code and then pass it to Java code
which expects an array.
Scala offers implicit conversions between all the major collection types in the JavaConverters
object.
The following conversions are two-way (converting and converting back yields the original object):

Iterator <=> java.util.Iterator
Iterator <=> java.util.Enumeration
Iterable <=> java.lang.Iterable
Iterable <=> java.util.Collection
mutable.Buffer <=> java.util.List
mutable.Set <=> java.util.Set
mutable.Map <=> java.util.Map
mutable.ConcurrentMap <=> java.util.concurrent.ConcurrentMap
Certain other Scala collections can also be converted to Java, but do not have a conversion back
to the original Scala type:
Seq => java.util.List
mutable.Seq => java.util.List
Set => java.util.Set
Map => java.util.Map
Reference:
A Java 8 compatibility kit for Scala.
import java.util.function._
import scala.compat.java8.FunctionConverters._
import scala.compat.java8.OptionConverters._
class Test {
val o = Option(2.7)
val oj = o.asJava // Optional[Double]
val ojd = o.asPrimitive // OptionalDouble
val ojds = ojd.asScala // Option(2.7) again
}
import java.util.stream.IntStream
import scala.compat.java8.StreamConverters._
import scala.compat.java8.collectionImpl.{Accumulator, LongAccumulator}
Chapter 22: JSON
Examples
JSON with spray-json
spray-json provides an easy way to work with JSON. Using implicit formats, everything happens
"behind the scenes":
Note that the last parameter, the version number (1.3.2), may be different in different projects.
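The SBT dependency the note refers to might look like this (group and artifact per spray-json's published coordinates):

libraryDependencies += "io.spray" %% "spray-json" % "1.3.2"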
import spray.json._
import DefaultJsonProtocol._
The default JSON protocol DefaultJsonProtocol contains formats for all basic types. To provide
JSON functionality for custom types, either use convenience builders for formats or write formats
explicitly.
Read JSON
// generates an intermediate JSON representation (abstract syntax tree)
val res = """{ "foo": "bar" }""".parseJson // JsValue = {"foo":"bar"}
Write JSON
val values = List("a", "b", "c")
values.toJson.prettyPrint // ["a", "b", "c"]
DSL
DSL is not supported.
// serialize a Person (assuming case classes Person(name: String, address: Address) and
// Address(street: String, city: String), with spray-json formats defined for both)
val fred = Person("Fred", Address("Awesome Street 9", "SuperCity"))
val fredJsonString = fred.toJson.prettyPrint
{
"name": "Fred",
"address": {
"street": "Awesome Street 9",
"city": "SuperCity"
}
}
Custom Format
Write a custom JsonFormat if a special serialization of a type is required. For example, if the field
names are different in Scala than in JSON. Or, if different concrete types are instantiated based on
the input.
// serialization code
override def write(person: Person): JsValue = JsObject(
"name" -> person.name.toJson,
"home" -> person.address.toJson
)
}
Circe provides compile-time derived codecs to encode and decode JSON into case classes. A simple
example looks like this:
import io.circe._
import io.circe.generic.auto._
import io.circe.parser._
import io.circe.syntax._

case class User(id: Int, name: String)
val user = User(1, "John Doe")

// {"id":1,"name":"John Doe"}
val json = user.asJson.noSpaces

// decoding back: Right(User(1,John Doe))
val decoded = decode[User](json)
import play.api.libs.json._
import play.api.libs.functional.syntax._ // if you need DSL
DefaultFormat contains default formats to read/write all basic types. To provide JSON functionality
for your own types, you can either use convenience builders for formats or write formats explicitly.
Read json
Write json
DSL
As always prefer pattern matching against JsSuccess/JsError and try to avoid .get, array(i) calls.
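A sketch of DSL-style access and of the safer validate-based reading (the JSON value is illustrative):

val json = Json.parse("""{ "name": "Fred", "age": 42 }""")

(json \ "age").as[Int]          // 42; throws if the field is missing or of the wrong type
(json \ "age").asOpt[Int]       // Some(42); None if missing

(json \ "age").validate[Int] match {
  case JsSuccess(age, _) => println(s"Age is $age")
  case JsError(errors)   => println(s"Invalid JSON: $errors")
}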
// serialize a Person
val fred = Person("Fred", Address("Awesome Street 9", "SuperCity"))
val fredJsonString = Json.stringify(Json.toJson(fred))
Own Format
You can write your own Format (or separate Reads and Writes) if you require a special serialization
of your type (e.g. naming the fields differently in Scala and JSON, or instantiating different
concrete types based on the input).
Alternative
If the json doesn't exactly match your case class fields (isAlive in case class vs is_alive in json):
case class User(username: String, friends: Int, enemies: Int, isAlive: Boolean)

object User {
  import play.api.libs.functional.syntax._
  import play.api.libs.json._

  // map the JSON field "is_alive" onto the Scala field isAlive
  implicit val userReads: Reads[User] = (
    (__ \ "username").read[String] and
    (__ \ "friends").read[Int] and
    (__ \ "enemies").read[Int] and
    (__ \ "is_alive").read[Boolean]
  )(User.apply _)
}

// for an optional field, declare isAlive: Option[Boolean] and use (__ \ "is_alive").readNullable[Boolean]
{
"field": "example field",
"date": 1459014762000
}
solution:
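A sketch of such a Reads, mapping the millisecond timestamp onto a date type (the case class name and the use of java.time.Instant are assumptions):

import play.api.libs.functional.syntax._
import play.api.libs.json._

case class JsonExample(field: String, date: java.time.Instant)

implicit val jsonExampleReads: Reads[JsonExample] = (
  (__ \ "field").read[String] and
  (__ \ "date").read[Long].map(millis => java.time.Instant.ofEpochMilli(millis))
)(JsonExample.apply _)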
Now, if you do wrap your object identifiers for type safety, you will enjoy this. See the following
json object:
{
"id": 91,
"data": "Some data"
}
Now you just need to read the primitive type (Long), and map it to your identifier:
// requires play.api.libs.json._ and play.api.libs.functional.syntax._ in scope
case class MyIdentifier(id: Long)                        // assumed wrapper for type safety
case class JsonExampleV2(id: MyIdentifier, data: String) // assumed shape matching the JSON above

object JsonExampleV2 {
  implicit val r: Reads[JsonExampleV2] = (
    (__ \ "id").read[Long].map(MyIdentifier) and
    (__ \ "data").read[String]
  )(JsonExampleV2.apply _)
}
code at https://github.com/pedrorijo91/scala-play-json-examples
SBT dependency:
Imports
import org.json4s.JsonDSL._
import org.json4s._
import org.json4s.native.JsonMethods._
Read json
Write json
val values = List("a", "b", "c")
compact(render(values)) // ["a", "b", "c"]
DSL
To serialize and deserialize a heterogeneous (or polymorphic) list, specific type hints need to be
provided.
trait Location
case class Street(name: String) extends Location
case class City(name: String, zipcode: String) extends Location
case class Address(street: Street, city: City) extends Location
case class Locations (locations : List[Location])
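The read call below needs serialization formats with type hints in scope; a sketch (the sample data is illustrative):

import org.json4s.ShortTypeHints
import org.json4s.native.Serialization
import org.json4s.native.Serialization.{read, write}

implicit val formats = Serialization.formats(
  ShortTypeHints(List(classOf[Street], classOf[City], classOf[Address]))
)

val locationsString = write(Locations(List(Street("Lower Street"), City("Berlin", "10115"))))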
read[Locations](locationsString)
Own Format
// Address(Awesome Stree,Super City)
Chapter 23: Macros
Introduction
Macros are a form of compile-time metaprogramming. Certain elements of Scala code, such as
annotations and methods, can be made to transform other code when they are compiled. Macros
are ordinary Scala code that operates on data types that represent other code. The Macro Paradise
plugin (http://docs.scala-lang.org/overviews/macros/paradise.html) extends the abilities of macros
beyond the base language.
Syntax
• def x() = macro x_impl // x is a macro, where x_impl is used to transform code
• def macroTransform(annottees: Any*): Any = macro impl // Use in annotations to make them
macros
Remarks
Macros are a language feature that needs to be enabled, either by importing
scala.language.experimental.macros or with the compiler option -language:experimental.macros.
Only macro definitions require this; code that uses macros does not.
Examples
Macro Annotation
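The object below is the macro implementation; the annotation class that the usage (@noop) and the @compileTimeOnly note below refer to might look like this (a sketch; macro annotations typically require the whitebox Context):

import scala.annotation.{compileTimeOnly, StaticAnnotation}
import scala.language.experimental.macros
import scala.reflect.macros.whitebox.Context

@compileTimeOnly("enable macro paradise to expand macro annotations")
class noop extends StaticAnnotation {
  def macroTransform(annottees: Any*): Any = macro linkMacro.impl
}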
object linkMacro {
def impl(c: Context)(annottees: c.Expr[Any]*): c.Expr[Any] = {
import c.universe._
c.Expr[Any](q"{..$annottees}")
}
}
The @compileTimeOnly annotation generates an error with a message indicating that the paradise
compiler plugin must be included to use this macro. Instructions to include this via SBT are here.
You can use the above-defined macro like this:
@noop
case class Foo(a: String, b: Int)
@noop
object Bar {
def f(): String = "hello"
}
@noop
def g(): Int = 10
Method Macros
When a method is defined to be a macro, the compiler takes the code that is passed as its
argument and turns it into an AST. It then invokes the macro implementation with that AST, and it
returns a new AST that is then spliced back to its call site.
import reflect.macros.blackbox.Context
object Macros {
// This macro simply sees if the argument is the result of an addition expression.
// E.g. isAddition(1+1) and isAddition("a"+1).
// but !isAddition(1+1-1), as the addition is underneath a subtraction, and also
// !isAddition(x.+), and !isAddition(x.+(a,b)) as there must be exactly one argument.
def isAddition(x: Any): Boolean = macro isAddition_impl
// The signature of the macro implementation is the same as the macro definition,
// but with a new Context parameter, and everything else is wrapped in an Expr.
def isAddition_impl(c: Context)(expr: c.Expr[Any]): c.Expr[Boolean] = {
import c.universe._ // The universe contains all the useful methods and types
val plusName = TermName("+").encodedName // Take the name + and encode it as $plus
expr.tree match { // Turn expr into an AST representing the code in isAddition(...)
case Apply(Select(_, `plusName`), List(_)) => reify(true)
// Pattern match the AST to see whether we have an addition
// Above we match this AST
// Apply (function application)
// / \
// Select List(_) (exactly one argument)
// (selection ^ of entity, basically the . in x.y)
// / \
// _ \
// `plusName` (method named +)
case _ => reify(false)
// reify is a macro you use when writing macros
// It takes the code given as its argument and creates an Expr out of it
}
}
}
It is also possible to have macros that take Trees as arguments. Like how reify is used to create
Exprs, the q (for quasiquote) string interpolator lets us create and deconstruct Trees. Note that we
could have used q above (expr.tree is, surprise, a Tree itself) too, but didn't for demonstrative
purposes.
// No Exprs, just Trees
def isAddition_impl(c: Context)(tree: c.Tree): c.Tree = {
import c.universe._
tree match {
// q is a macro too, so it must be used with string literals.
// It can destructure and create Trees.
// Note how there was no need to encode + this time, as q is smart enough to do it itself.
case q"${_} + ${_}" => q"true"
case _ => q"false"
}
}
Errors in Macros
Macros can trigger compiler warnings and errors through the use of their Context.
Say we're particularly overzealous when it comes to bad code, and we want to mark every
instance of technical debt with a compiler info message (let's not think about how bad this idea is).
We can use a macro that does nothing except emit such a message.
import reflect.macros.blackbox.Context
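A sketch of such a macro (the techDebt name is illustrative):

import scala.language.experimental.macros

object TechDebt {
  def techDebt(): Unit = macro impl_techDebt

  def impl_techDebt(c: Context)(): c.Tree = {
    import c.universe._
    c.info(c.enclosingPosition, "Technical debt found here!", force = true)
    q"()"
  }
}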
Additionally, instead of using ??? to mark unimplemented code, we can create two macros, !!! and
?!?, that serve the same purpose, but emit compiler warnings. ?!? will cause a warning to be
issued, and !!! will cause an outright error.
import reflect.macros.blackbox.Context
def impl_!!!(c: Context): c.Tree = {
import c.universe._
c.error(c.enclosingPosition, "Unimplemented!")
q"${termNames.ROOTPKG}.scala.Predef.???"
}
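The corresponding macro definitions and the warning-emitting counterpart might look like this (a sketch following the same pattern):

def !!! : Nothing = macro impl_!!!
def ?!? : Nothing = macro impl_?!?

def impl_?!?(c: Context): c.Tree = {
  import c.universe._
  c.warning(c.enclosingPosition, "Unimplemented!")
  q"${termNames.ROOTPKG}.scala.Predef.???"
}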
Chapter 24: Monads
Examples
Monad Definition
Informally, a monad is a container of elements, notated as F[_], packed with 2 functions: flatMap
(to transform this container) and unit (to create this container).
Formal definition
Monad M is a parametric type M[T] with two operations, flatMap and unit, such that:

trait M[T] {
  def flatMap[U](f: T => M[U]): M[U]
}

def unit[T](x: T): M[T]
Example:
val m = List(1, 2, 3)
def unit(x: Int): List[Int] = List(x)
def f(x: Int): List[Int] = List(x * x)
def g(x: Int): List[Int] = List(x * x * x)
val x = 1
1. Associativity: m.flatMap(f).flatMap(g) == m.flatMap(x => f(x).flatMap(g))
2. Left unit: unit(x).flatMap(f) == f(x)
3. Right unit: m.flatMap(unit) == m
Most of the standard collections are monads (List[T], Option[T]), or monad-like (Either[T],
Future[T]). These collections can be easily combined together within for comprehensions (which
are an equivalent way of writing flatMap transformations):
val a = List(1, 2, 3)
val b = List(3, 4, 5)
for {
i <- a
j <- b
} yield(i * j)
a flatMap {
i => b map {
j => i * j
}
}
Because a monad preserves the data structure and only acts on the elements within that structure,
we can endlessly chain monadic data structures, as shown here in a for comprehension.
Chapter 25: Operator Overloading
Examples
Defining Custom Infix Operators
In Scala operators (such as +, -, *, ++, etc.) are just methods. For instance, 1 + 2 can be written as
1.+(2). These sorts of methods are called 'infix operators'.
This means custom methods can be defined on your own types, reusing these operators:
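A sketch (the Complex class is illustrative):

case class Complex(re: Double, im: Double) {
  def +(other: Complex): Complex = Complex(re + other.re, im + other.im)
}

Complex(1, 2) + Complex(3, 4)    // Complex(4.0, 6.0)
Complex(1, 2).+(Complex(3, 4))   // the same call written as an ordinary method call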
Note that infix operators can only have a single argument; the object before the operator calls
its own operator method with the object after the operator as the argument. Any Scala method with
a single argument can be used as an infix operator.
This should be used sparingly. It is generally considered good practice only if
your own method does exactly what one would expect from that operator. In case of
doubt, use a more conservative name, like add instead of +.
Unary operators can be defined by prepending the operator with unary_. Unary operators are
limited to unary_+, unary_-, unary_! and unary_~:
// inside a matrix-like class assumed to have fields `rows`, `cols` and `data`
def unary_- = {
  val newData = for (r <- 0 until rows) yield
    for (c <- 0 until cols) yield this.data(r)(c) * -1
  new Matrix(newData)   // assumption: a Matrix constructor accepting the negated data
}
This should be used sparingly. Overloading a unary operator with a definition that
is not what one would expect can lead to confusing code.
Chapter 26: Operators in Scala
Examples
Built-in Operators
Scala has the following built-in operators (methods/language elements with predefined
precedence rules):
Note: methods ending with : bind to the right (and are right-associative), so a call like
list.::(value) can be written as value :: list with operator syntax. (1 :: 2 :: 3 :: Nil is the
same as 1 :: (2 :: (3 :: Nil)).)
Operator Overloading
class Team {
def +(member: Person) = ...
}
ITTeam + Jack
or
ITTeam.+(Jack)
To define unary operators you can prefix it with unary_. E.g. unary_!
class MyBigInt {
def unary_! = ...
}
Operator Precedence
Programming in Scala gives the following outline based on the 1st character in the operator. E.g. >
is the 1st character in the operator >>>:
Operator (first character), from highest to lowest precedence:

(all other special characters)
* / %
+ -
:
= !
< >
&
^
|
(all letters)
The one exception to this rule concerns assignment operators, e.g. +=, *=, etc. If an operator ends
with an equal character (=) and is not one of the comparison operators <=, >=, == or !=, then the
precedence of the operator is the same as simple assignment. In other words, lower than that of
any other operator.
Chapter 27: Option Class
Syntax
• class Some[+T](value: T) extends Option[T]
• Option[T](value: T)
Constructor to create either a Some(value) or None as appropriate for the value provided.
Examples
Options as Collections
Options have some useful higher-order functions that can be easily understood by viewing options
as collections with zero or one items - where None behaves like the empty collection, and Some(x)
behaves like a collection with a single item, x.
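For example (a sketch):

val some: Option[Int] = Some(3)
val none: Option[Int] = None

some.map(_ * 2)        // Some(6)
none.map(_ * 2)        // None
some.filter(_ > 5)     // None
some.foreach(println)  // prints 3
none.foreach(println)  // prints nothing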
In Java (and other languages), using null is a common way of indicating that there is no value
attached to a reference variable. In Scala, using Option is preferred over using null. Option wraps
values that might be null.
None is the Option subtype that represents the absence of a value (conceptually, the null case).
Some is the Option subtype that wraps an actual, non-null reference.
This is typical code when calling a Java library that might return a null reference:
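A sketch (javaApi, Resource and path are illustrative stand-ins for the Java library being called):

val resource: Option[Resource] = Option(javaApi.getResource(path))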
If getResource() returns a null value, resource will be a None object. Otherwise it will be a
Some(resource) object. The preferred way to handle an Option is using higher order functions
available within the Option type. For example if you want to check if your value is not None (similar
to checking if value == null), you would use the isDefined function:
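A sketch:

resource.isDefined   // true if getResource() returned a non-null value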
It is preferred that you treat conditional execution on the wrapped value of an Option (without using
the 'exceptional' Option.get method) by treating the Option as a monad and using foreach:
If a Resource instance is required (versus an Option[Resource] instance), you can still use Option to
protect against null values. Here the getOrElse method provides a default value:
Java code won't readily handle Scala's Option, so when passing values to Java code it is good
form to unwrap an Option, passing null or a sensible default where appropriate:
Basics
An Option is a data structure that contains either a single value, or no value at all. An Option can be
thought of as a collection of zero or one elements.
Option is useful in expressions that would otherwise use null to represent the lack of a concrete
value. This protects against a NullPointerException, and allows the composition of many
expressions that might not return a value, using combinators such as map, flatMap, etc.
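The countries map used below might be defined like this (a sketch; only the "USA" entry is required by the output shown):

val countries = Map("USA" -> "Washington", "Germany" -> "Berlin")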
println(countries.get("USA")) // Some(Washington)
println(countries.get("France")) // None
println(countries.get("USA").get) // Washington
println(countries.get("France").get) // Error: NoSuchElementException
println(countries.get("USA").getOrElse("Nope")) // Washington
println(countries.get("France").getOrElse("Nope")) // Nope
Option[A] is sealed and thus cannot be extended. Therefore its semantics are stable and can be
relied on.
Options have a flatMap method. This means they can be used in a for comprehension. In this way
we can lift regular functions to work on Options without having to redefine them.
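For example (a sketch):

def sum(a: Int, b: Int): Int = a + b

val combined: Option[Int] = for {
  a <- Option(5)
  b <- Option(10)
} yield sum(a, b)   // Some(15)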
When one of the values is None, the end result of the calculation will be None as well.
Note: this pattern extends more generally for concepts called Monads. (More information should be
available on pages relating to for comprehensions and Monads)
In general it is not possible to mix different monads in a for comprehension. But since Option can
be easily converted to an Iterable, we can easily mix Options and Iterables by calling the
.toIterable method.
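A sketch, using .toList as the explicit conversion:

val list = List(1, 2, 3)
val opt  = Option(10)

val mixed = for {
  i <- list
  j <- opt.toList   // explicit conversion lets the Option participate
} yield i + j       // List(11, 12, 13)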
A small note: if we had written the for comprehension the other way around, it would compile even
without the explicit conversion, because the Option would be converted implicitly. For that reason
it is useful to always add the explicit conversion (.toIterable, .toList, or the corresponding
method for the collection you are using) for consistency.
Chapter 28: Packages
Introduction
Packages in Scala manage namespaces in large programs. For example, the name connection
can occur in the packages com.sql and org.http. You can use the fully qualified com.sql.connection
and org.http.connection, respectively, in order to access each of these packages.
Examples
Package structure
package com {
package utility {
package serialization {
class Serializer
...
}
}
}
The package clause is not directly bound to the file where it is found. It is possible to find
common elements of the package clause in different files. For example, the package clauses
below can be found in the file math1.scala and in the file math2.scala.
File math1.scala
package org {
package math {
package statistics {
class Interval
}
}
}
File math2.scala
package org {
package math{
package probability {
class Density
}
}
}
File study.scala
import org.math.probability.Density
import org.math.statistics.Interval
object Study {
  val interval = new Interval
  val density = new Density
}
Chapter 29: Parallel Collections
Remarks
Parallel collections facilitate parallel programming by hiding low-level parallelization details. This
makes taking advantage of multi-core architectures easy. Examples of parallel collections include
ParArray, ParVector, mutable.ParHashMap, immutable.ParHashMap, and ParRange. A full list can be found
in the documentation.
Examples
Creating and Using Parallel Collections
To create a parallel collection from a sequential collection, call the par method. To create a
sequential collection from a parallel collection, call the seq method. This example shows how you
turn a regular Vector into a ParVector, and then back again:
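A sketch of the forward direction (vect and parVect are the names reused below):

scala> val vect = (1 to 5).toVector
vect: scala.collection.immutable.Vector[Int] = Vector(1, 2, 3, 4, 5)

scala> val parVect = vect.par
parVect: scala.collection.parallel.immutable.ParVector[Int] = ParVector(1, 2, 3, 4, 5)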
scala> parVect.seq
res0: scala.collection.immutable.Vector[Int] = Vector(1, 2, 3, 4, 5)
The par method can be chained, allowing you to convert a sequential collection to a parallel
collection and immediately perform an action on it:
scala> vect.map(_ * 2)
res1: scala.collection.immutable.Vector[Int] = Vector(2, 4, 6, 8, 10)
scala> vect.par.map(_ * 2)
res2: scala.collection.parallel.immutable.ParVector[Int] = ParVector(2, 4, 6, 8, 10)
In these examples, the work is actually parceled out to multiple processing units, and then re-
joined after the work is complete - without requiring developer intervention.
Pitfalls
Do not use parallel collections when the collection elements must be received in a specific
order.
Parallel collections perform operations concurrently. That means that all of the work is divided into
parts and distributed to different processors. Each processor is unaware of the work being done by
others. If the order of the collection matters then work processed in parallel is nondeterministic.
(Running the same code twice can yield different results.)
Non-associative Operations
If an operation is non-associative (if the order of execution matters), then the result on a
parallelized collection will be nondeterministic.
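The collections used below might be defined like this (a sketch; the sequential result shown matches a list of 1 to 1000):

scala> val list = (1 to 1000).toList
scala> val listPar = list.par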
scala> list.reduce(_ - _)
res0: Int = -500498
scala> list.reduce(_ - _)
res1: Int = -500498
scala> list.reduce(_ - _)
res2: Int = -500498
scala> listPar.reduce(_ - _)
res3: Int = -408314
scala> listPar.reduce(_ - _)
res4: Int = -422884
scala> listPar.reduce(_ - _)
res5: Int = -301748
Side Effects
Operations that have side effects, such as foreach, may not execute as desired on parallelized
collections due to race conditions. Avoid this by using functions that have no side effects, such as
reduce or map.
Chapter 30: Parser Combinators
Remarks
ParseResult Cases
• Success, with a marker as to the start of the match and the next character to be matched.
• Failure, with a marker as to the start of where the match was attempted. In this case the
parser backtracks to that position, where it will be when parsing continues.
• Error, which stops the parsing. No backtracking or further parsing occurs.
Examples
Basic Example
import scala.util.parsing.combinator._
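A minimal parser producing the results discussed below might look like this (the grammar is an assumption based on the regex in the failure message):

object SimpleParser extends RegexParsers {
  def name: Parser[String] = "[A-Z][a-z]+".r
}

// A successful parse: the name "Alice" is matched, and parsing stops at position 6
SimpleParser.parse(SimpleParser.name, "Alice went to town")
// [1.6] parsed: Alice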
Parsing an input that begins with a lowercase letter instead fails:
[1.1] failure: string matching regex `[A-Z][a-z]+' expected but `b' found
The [1.6] in the Alice example indicates that the start of the match is at position 1, and the first
character remaining to match starts at position 6.
Chapter 31: Partial Functions
Examples
Composition
In this usage, the partial functions are attempted in order of concatenation with the orElse method.
Typically, a final partial function is provided that matches all remaining cases. Collectively, the
combination of these functions acts as a total function.
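A sketch of such a chain:

val positive: PartialFunction[Int, String] = { case x if x > 0 => "positive" }
val negative: PartialFunction[Int, String] = { case x if x < 0 => "negative" }
val zero:     PartialFunction[Int, String] = { case 0          => "zero" }

val describe = positive orElse negative orElse zero   // together they cover every Int

describe(5)    // "positive"
describe(0)    // "zero"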
This pattern is typically used to separate concerns where a function may effectively act as a
dispatcher for disparate code paths. This is common, for example, in the receive method of an
Akka Actor.
While partial functions are often used as convenient syntax for total functions, by including a final
wildcard match (case _), in some methods their partiality is key. One very common example in
idiomatic Scala is the collect method, defined in the Scala collections library. Here, partial
functions allow the common operations of examining, mapping and/or filtering the elements of a
collection to be expressed in one compact syntax.
Example 1
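The sqRoot partial function used below might be defined like this (a sketch, defined only for non-negative numbers):

val sqRoot: PartialFunction[Double, Double] = {
  case x if x >= 0 => Math.sqrt(x)
}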
We can invoke it with the collect combinator:
List(-1.1,2.2,3.3,0).collect(sqRoot)
List(-1.1,2.2,3.3,0).filter(sqRoot.isDefinedAt).map(sqRoot)
Example 2
sealed trait SuperType // `sealed` modifier allows inheritance within current build-unit only
case class A(value: Int) extends SuperType
case class B(text: String) extends SuperType
case object C extends SuperType
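The input used below might look like this (a sketch consistent with the result shown):

val input: Seq[SuperType] = Seq(A(5), A(25), B("hello"), B(""), C)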
input.collect {
case A(value) if value < 10 => value.toString
case B(text) if text.nonEmpty => text
} // Seq("5", "hello")
• The left-hand side of each pattern match effectively selects elements to process and include
in the output. Any value that doesn't have a matching case is simply omitted.
• The right-hand side defines the case-specific processing to apply.
• Pattern matching binds variable for use in guard statements (the if clauses) and the right-
hand side.
Basic syntax
Scala has a special type of function called a partial function, which extends normal functions --
meaning that a PartialFunction instance can be used wherever Function1 is expected. Partial
functions can be defined anonymously using case syntax also used in pattern matching:
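A sketch:

val isOdd: PartialFunction[Int, String] = {
  case x if x % 2 == 1 => s"$x is odd"
}

isOdd(3)               // "3 is odd"
isOdd.isDefinedAt(4)   // false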
As seen in the example, a partial function need not be defined over the whole domain of its first
parameter. A standard Function1 instance is assumed to be total, meaning that it is defined for
every possible argument.
Partial functions are very common in idiomatic Scala. They are often used for their convenient case
-based syntax to define total functions over traits:
sealed trait SuperType // `sealed` modifier allows inheritance within current build-unit only
case object A extends SuperType
case object B extends SuperType
case object C extends SuperType
input.map {
case A => 5
case _ => 10
} // Seq(5, 10, 10)
This saves the additional syntax of a match statement in a regular anonymous function. Compare:
It is also frequently used to perform a parameter decomposition using pattern matching, when a
tuple or a case class is passed to a function:
These three map functions are equivalent, so use the variation that your team finds most readable.
// 1. No extraction
numberNames.map(it => s"${it._1} is written ${it._2}" )
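A sketch of numberNames and of the other two (equivalent) variants:

val numberNames = Seq((1, "one"), (2, "two"), (3, "three"))

// 2. Pattern-matching anonymous (partial) function
numberNames.map { case (number, name) => s"$number is written $name" }

// 3. Tuple extraction inside a regular anonymous function
numberNames.map { it =>
  val (number, name) = it
  s"$number is written $name"
}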
When a partial function is used where a total function is expected (as with map above), any input
that doesn't match one of its cases will throw an exception at runtime.
Read Partial Functions online: https://riptutorial.com/scala/topic/1638/partial-functions
Chapter 32: Pattern Matching
Syntax
• selector match partialFunction
• selector match { list of case alternatives } // This is the most common form of the above
Parameters
Parameter Details
Examples
Simple Pattern Match
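The f used below might be defined like this (a sketch consistent with the results shown):

def f(x: Int): String = x match {
  case 1 => "One"
  case 2 => "Two"
  case _ => "Unknown!"   // wildcard case: matches anything not handled above
}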
f(2) // "Two"
f(3) // "Unknown!"
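Without a wildcard case the match is not exhaustive; a sketch of such a g:

def g(x: Int): String = x match {
  case 1 => "One"
  case 2 => "Two"
}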
g(1) // "One"
g(3) // throws a MatchError
It is also possible to match against values that are not defined inline. These must be stable
identifiers, which are obtained by either using a capitalized name or enclosing backticks.
Unlike other programming languages, such as Java, there is no fall-through: if a case block
matches the input, it is executed and the matching is finished. Therefore the least specific case
should be the last case block.
f(5) // "Default"
f(1) // "Default"
In standard pattern matching, the identifier used will shadow any identifier in the enclosing scope.
Sometimes it is necessary to match on the enclosing scope's variable.
The following example function takes a character and a list of tuples and returns a new list of
tuples. If the character existed as the first element in one of the tuples, the second element is
incremented. If it does not yet exist in the list, a new tuple is created.
def tabulate(char: Char, tab: List[(Char, Int)]): List[(Char, Int)] = tab match {
case Nil => List((char, 1))
case (`char`, count) :: tail => (char, count + 1) :: tail
case head :: tail => head :: tabulate(char, tail)
}
The above demonstrates pattern matching where the method's input, char, is kept 'stable' in the
pattern match: that is, if you call tabulate('x', ...), the first case statement would be interpreted
as:
Scala will interpret any variable demarcated with a tick mark as a stable identifier: it will also
interpret any variable that starts with a capital letter in the same way.
Pattern Matching on a Seq
In general, any form that can be used to construct a sequence can be used to pattern match
against an existing sequence.
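For example (a sketch):

Seq(1, 2, 3) match {
  case Seq(first, second, rest @ _*) => s"first: $first, second: $second, rest: $rest"
  case _                             => "fewer than two elements"
}
// "first: 1, second: 2, rest: List(3)"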
Note that while using Nil and :: will work when pattern matching a Seq, it effectively treats it as
a List and can have unexpected results. Constrain yourself to Seq(...) and +: to avoid this.
Note that using :: will not work for a WrappedArray, Vector, etc., as shown here:
scala> f(Array(1,2))
res0: Any = No match
And with +:
scala> g(Array(1,2).toSeq)
res4: Any = 1
Case statements can be combined with if expressions to provide extra logic when pattern
matching.
It is important to ensure your guards do not create a non-exhaustive match (the compiler often will
not catch this):
This throws a MatchError on odd numbers. You must either account for all cases, or use a wildcard
match case:
Every case class defines an extractor that can be used to capture the members of the case class
when pattern matching:
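For example (a sketch):

case class Person(name: String, age: Int)

Person("Ada", 30) match {
  case Person(name, age) => s"$name is $age years old"
}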
All the normal rules of pattern-matching apply - you can use guards and constant expressions to
control matching:
Matching on an Option
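A sketch of matching on an Option:

val maybeValue: Option[Int] = Some(42)

maybeValue match {
  case Some(value) => s"Got $value"
  case None        => "Got nothing"
}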
When pattern matching an object whose type is a sealed trait, Scala will check at compile-time
that all cases are 'exhaustively matched':
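A sketch of such a sealed hierarchy (the concrete shapes are illustrative):

sealed trait Shape
case class Circle(radius: Double) extends Shape
case class Rectangle(width: Double, height: Double) extends Shape

def area(shape: Shape): Double = shape match {
  case Circle(r)       => math.Pi * r * r
  case Rectangle(w, h) => w * h
}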
If a new case class for Shape is later added, all match statements on Shape will start to produce a
compiler warning. This makes thorough refactoring easier: the compiler will alert the developer to
all code that needs to be updated.
Pattern Matching with Regex
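The emailRegex used below might be declared like this (a deliberately simplistic pattern):

val emailRegex = "(.+)@(.+)\\.(.+)".r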
"name@example.com" match {
case emailRegex(userName, domain, topDomain) => println(s"Hi $userName from $domain")
case _ => println(s"This is not a valid email.")
}
In this example, the regex attempts to match the email address provided. If it does, the userName
and domain are extracted and printed. topDomain is also extracted, but nothing is done with it in this
example. Calling .r on a String str is equivalent to new Regex(str). The r function is available via
an implicit conversion.
The @ sign binds a variable to a name during a pattern match. The bound variable can either be
the entire matched object or part of the matched object:
Pattern Matching Types
Pattern matching can also be used to check the type of an instance, rather than using
isInstanceOf[B]:
anyRef match {
case _: Number => "It is a number"
case _: String => "It is a string"
case _: CharSequence => "It is a char sequence"
}
//> res0: String = It is a string
anyRef match {
case _: Number => "It is a number"
case _: CharSequence => "It is a char sequence"
case _: String => "It is a string"
}
//> res1: String = It is a char sequence
In this manner it is similar to a classical 'switch' statement, without the fall-through functionality.
However, you can also pattern match and 'extract' values from the type in question. For instance:
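A sketch of such an extraction (the class and value names are illustrative, chosen to match the note that follows):

sealed trait Human
case class Foo(name: String) extends Human
case class Woo(name: String, age: Int) extends Human

def describe(h: Human): String = h match {
  case Foo(_)       => "A Foo"                 // the name is not bound to anything
  case Woo(name, _) => s"A Woo called $name"   // the age is not bound to anything
}

describe(Foo("Hadas"))       // "A Foo"
describe(Woo("Betty", 27))   // "A Woo called Betty"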
Note that in the Foo and Woo case we use the underscore (_) to 'match an unbound variable'. That is
to say that the value (in this case Hadas and 27, respectively) is not bound to a name and thus is
not available in the handler for that case. This is useful shorthand in order to match 'any' value
without worrying about what that value is.
The @switch annotation tells the compiler that the match statement can be replaced with a single
tableswitch instruction at the bytecode level. This is a minor optimization that can remove
unnecessary comparisons and variable loads during runtime.
The @switch annotation works only for matches against literal constants and final val identifiers. If
the pattern match cannot be compiled as a tableswitch/lookupswitch, the compiler will raise a
warning.
import annotation.switch
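The suffix method used below might be defined like this (a sketch):

def suffix(i: Int): String = (i: @switch) match {
  case 1 => "1st"
  case 2 => "2nd"
  case 3 => "3rd"
  case _ => s"${i}th"
}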
scala> suffix(2)
res1: String = "2nd"
scala> suffix(4)
res2: String = "4th"
The | can be used to have a single case statement match against multiple inputs to yield the same
result:
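For example (a sketch):

def isWeekend(day: String): Boolean = day match {
  case "Saturday" | "Sunday" => true
  case _                     => false
}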
Note that while matching values this way works well, the following matching of types will cause
problems:
sealed class FooBar
case class Foo(s: String) extends FooBar
case class Bar(s: String) extends FooBar
val d = Foo("Diana")
val h = Bar("Hadas")
If in the latter case (with _) you don't need the value of the unbound variable and just want to do
something else, you're fine:
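Pattern matching also works inside anonymous functions, for example over (item, price) tuples (a sketch; the values are illustrative):

val prices = Seq(("Bread", 1.20), ("Butter", 3.50), ("Milk", 0.99))

prices.map {
  case ("Bread", price)             => s"Bread costs $price"
  case (item, price) if price < 1.0 => s"$item is cheap at $price"
  case (item, price)                => s"$item costs $price"
}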
The first case shows how to match against a specific string and get the corresponding price. The
second case shows a use of if and tuple extraction to match against elements of the tuple.
Chapter 33: Quasiquotes
Examples
Create a syntax tree with quasiquotes
import reflect.macros.blackbox.Context

object `macro` {
  def addCreationDate(): java.util.Date = macro impl.addCreationDate
}

object impl {
  def addCreationDate(c: Context)(): c.Expr[java.util.Date] = {
    import c.universe._
    // build the syntax tree for `new java.util.Date()` using a quasiquote
    c.Expr[java.util.Date](q"new java.util.Date()")
  }
}
It can be arbitrarily complex but it will be validated for correct scala syntax.
Chapter 34: Recursion
Examples
Tail Recursion
Using regular recursion, each recursive call pushes another entry onto the call stack. When the
recursion completes, the application has to pop each entry back off. If there are many recursive
calls, this can result in a huge stack.
Scala automatically optimizes the recursion if it finds the recursive call in tail position. The
@tailrec annotation can be added to recursive functions to ensure that tail-call optimization is
performed. The compiler then shows an error message if it can't optimize your recursion.
Regular Recursion
This example is not tail recursive because when the recursive call is made, the function needs to
keep track of the multiplication it needs to do with the result after the call returns.
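A sketch of such a factorial:

def fact(i: Int): Long =
  if (i <= 1) 1
  else i * fact(i - 1)   // the pending multiplication keeps the call out of tail position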
println(fact(5))
The function call with the parameter will result in a stack that looks like this:
(fact 5)
(* 5 (fact 4))
(* 5 (* 4 (fact 3)))
(* 5 (* 4 (* 3 (fact 2))))
(* 5 (* 4 (* 3 (* 2 (fact 1)))))
(* 5 (* 4 (* 3 (* 2 (* 1 (fact 0))))))
(* 5 (* 4 (* 3 (* 2 (* 1 * 1)))))
(* 5 (* 4 (* 3 (* 2))))
(* 5 (* 4 (* 6)))
(* 5 (* 24))
120
If we try to annotate this example with @tailrec we will get the following error message: could not
optimize @tailrec annotated method fact: it contains a recursive call not in tail position
Tail Recursion
In tail recursion, you perform your calculations first, and then you execute the recursive call,
passing the results of your current step to the next recursive step.
import scala.annotation.tailrec

def fact_with_tailrec(i: Int): Long = {
  @tailrec
def fact_inside(i : Int, sum: Long) : Long = {
if(i <= 1) sum
else fact_inside(i-1,sum*i)
}
fact_inside(i,1)
}
println(fact_with_tailrec(5))
In contrast, the stack trace for the tail recursive factorial looks like the following:
(fact_with_tailrec 5)
(fact_inside 5 1)
(fact_inside 4 5)
(fact_inside 3 20)
(fact_inside 2 60)
(fact_inside 1 120)
There is only the need to keep track of the same amount of data for every call to fact_inside
because the function is simply returning the value it got right through to the top. This means that
even if fact_with_tail 1000000 is called, it needs only the same amount of space as fact_with_tail
3. This is not the case with the non-tail-recursive call, and as such large values may cause a stack
overflow.
It is very common to get a StackOverflowError while calling a recursive function. The Scala standard
library offers TailCalls (trampolining) to avoid stack overflows by using heap objects and
continuations to store the local state of the recursion.
import scala.util.control.TailCalls._
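A sketch using the TailCalls API (the classic mutually recursive even/odd example):

def isEven(xs: List[Int]): TailRec[Boolean] =
  if (xs.isEmpty) done(true) else tailcall(isOdd(xs.tail))

def isOdd(xs: List[Int]): TailRec[Boolean] =
  if (xs.isEmpty) done(false) else tailcall(isEven(xs.tail))

isEven((1 to 100000).toList).result   // true, without overflowing the stack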
Read Recursion online: https://riptutorial.com/scala/topic/3889/recursion
Chapter 35: Reflection
Examples
Loading a class using reflection
import scala.reflect.runtime.universe._
val mirror = runtimeMirror(getClass.getClassLoader)
val module = mirror.staticModule("org.data.TempClass")
Chapter 36: Regular Expressions
Syntax
• re.findAllIn(s: CharSequence): MatchIterator
• re.findAllMatchIn(s: CharSequence): Iterator[Match]
• re.findFirstIn(s: CharSequence): Option[String]
• re.findFirstMatchIn(s: CharSequence): Option[Match]
• re.findPrefixMatchIn(s: CharSequence): Option[Match]
• re.findPrefixOf(s: CharSequence): Option[String]
• re.replaceAllIn(s: CharSequence, replacer: Match => String): String
• re.replaceAllIn(s: CharSequence, replacement: String): String
• re.replaceFirstIn(s: CharSequence, replacement: String): String
• re.replaceSomeIn(s: CharSequence, replacer: Match => Option[String]): String
• re.split(s: CharSequence): Array[String]
Examples
Declaring regular expressions
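A sketch of the basic declaration:

val digits = "\\d+".r                                   // via the .r method on String
val digitsToo = new scala.util.matching.Regex("\\d+")   // equivalent, via the constructor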
There is an overloaded version of r, def r(names: String*): Regex which allows you to assign
group names to your pattern captures. This is somewhat brittle as the names are disassociated
from the captures, and should only be used if the regular expression will be used in multiple
locations:
val dateRegex = """(\d{4})-(\d{2})-(\d{2})""".r("y", "m", "d")

dateRegex.findFirstMatchIn("2016-01-23") match {
  case Some(matched) =>
    val y = matched.group("y").toInt
    val m = matched.group("m").toInt
    val d = matched.group("d").toInt
    java.time.LocalDate.of(y, m, d)
  case None => ???
}
val re = """\((.*?)\)""".r
val str =
"(The)(example)(of)(repeating)(pattern)(in)(a)(single)(string)(I)(had)(some)(trouble)(with)(once)"
re.findAllMatchIn(str).map(_.group(1)).toList
res2: List[String] = List(The, example, of, repeating, pattern, in, a, single, string, I, had,
some, trouble, with, once)
Chapter 37: Scala.js
Introduction
Scala.js is a Scala-to-JavaScript compiler: the compiled code runs outside the JVM, in JavaScript
environments. Its benefits include strong typing, code optimization at compile time, and full
interoperability with JavaScript libraries.
Examples
console.log in Scala.js
Simple Class
Collections
Manipulating DOM
import org.scalajs.dom
import dom.document

// a helper that appends a <p> element to `target` (the helper's name is assumed)
def appendP(target: dom.Node, text: String): Unit = {
  val pNode = document.createElement("p")
  pNode.textContent = text
  target.appendChild(pNode)
}
Sbt dependency
Running
sbt run
sbt ~run
sbt fastOptJS
Chapter 38: Scaladoc
Syntax
• Goes above methods, fields, classes or packages.
• Starts with /**
• Each line starts with a *, followed by the comment text
• Ends with */
Parameters

Parameter                               Details
Class specific                          _
Method specific                         _
Method, Constructor and/or Class tags   _
Usage                                   _
Other                                   _
@version detail                         Provides the version that this portion belongs to.
Examples
Simple Scaladoc to method
/**
* Explain briefly what method does here
* @param x Explain briefly what should be x and how this affects the method.
* @param y Explain briefly what should be y and how this affects the method.
* @return Explain what is returned from execution.
*/
def method(x: Int, y: String): Option[Double] = {
// Method content
}
Chapter 39: scalaz
Introduction
Scalaz is a Scala library for functional programming.
It provides purely functional data structures to complement those from the Scala standard library.
It defines a set of foundational type classes (e.g. Functor,Monad) and corresponding instances for a
large number of data structures.
Examples
ApplyUsage
import scalaz._
import Scalaz._

scala> val intToString: Int => String = _.toString

scala> Apply[Option].ap(1.some)(some(intToString))
res1: Option[String] = Some(1)

scala> Apply[Option].ap(none)(some(intToString))
res2: Option[String] = None
FunctorUsage
import scalaz._
import Scalaz._
scala> val len: String => Int = _.length
len: String => Int = $$Lambda$1164/969820333@7e758f40
scala> Functor[Option].map(Some("foo"))(len)
res0: Option[Int] = Some(3)
scala> Functor[Option].map(None)(len)
res1: Option[Int] = None
scala> :kind Functor
scalaz.Functor's kind is X[F[A]]
ArrowUsage
import scalaz._
import Scalaz._
scala> val plus1 = (_: Int) + 1
plus1: Int => Int = $$Lambda$1167/1113119649@6a6bfd97
Chapter 40: Scope
Introduction
Scope in Scala defines where a value (def, val, var or class) can be accessed from.
Syntax
• declaration
• private declaration
• private[this] declaration
• private[fromWhere] declaration
• protected declaration
• protected[fromWhere] declaration
Examples
Public (default) scope
By default, the scope is public, the value can be accessed from anywhere.
package com.example {
class FooClass {
val x = "foo"
}
}
package an.other.package {
class BarClass {
val foo = new com.example.FooClass
foo.x // <- Accessing a public value from another package
}
}
A private scope
When the scope is private, it can only be accessed from the current class or other instances of the
current class.
package com.example {
class FooClass {
private val x = "foo"
def aFoo(otherFoo: FooClass) {
otherFoo.x // <- Accessing from another instance of the same class
}
}
class BarClass {
val f = new FooClass
f.x // <- This will not compile
}
}
You can specify a package where the private value can be accessed.
package com.example {
class FooClass {
private val x = "foo"
private[example] val y = "bar"
}
class BarClass {
val f = new FooClass
f.x // <- Will not compile
f.y // <- Will compile
}
}
The most restrictive scope is "object-private" scope, which only allows that value to be accessed
from the same instance of the object.
class FooClass {
private[this] val x = "foo"
def aFoo(otherFoo: FooClass) = {
otherFoo.x // <- This will not compile, accessing x outside the object instance
}
}
Protected scope
The protected scope allows the value to be accessed from any subclasses of the current class.
class FooClass {
protected val x = "foo"
}
class BarClass extends FooClass {
val y = x // It is a subclass instance, will compile
}
class ClassB {
val f = new FooClass
f.x // <- This will not compile
}
The package protected scope allows the value to be accessed only from any subclass in a specific
package.
package com.example {
class FooClass {
protected[example] val x = "foo"
}
class ClassB extends FooClass {
val y = x // It's in the protected scope, will compile
}
}
package com {
class BarClass extends com.example.FooClass {
val y = x // <- Outside the protected scope, will not compile
}
}
Chapter 41: Self types
Syntax
• trait Type { selfId => /* other members can refer to selfId in case `this` means something else */ }
• trait Type { selfId: OtherType => /* other members can use selfId and it will be of type OtherType */ }
• trait Type { selfId: OtherType1 with OtherType2 => /* selfId is of type OtherType1 and OtherType2 */ }
Remarks
Often used with the cake pattern.
Examples
Simple self type example
Self types can be used in traits and classes to declare constraints on the concrete classes they are mixed into. This syntax also makes it possible to use a different identifier for this (useful when an outer object has to be referenced from an inner object).
Assume you want to store some objects. For that, you create interfaces for the storage and to add
values to a container:
trait Container[T] {
  def add(o: T): Unit
}
trait PermanentStorage[T] {
  /* Constraint on the self type: it must be a Container.
   * We can refer to that type via the chosen identifier, usually `this`, `self`,
   * or the type's name. */
  identifier: Container[T] =>

  // storage-specific members go here
}
This way those are not in the same object hierarchy, but PermanentStorage cannot be implemented
without also implementing Container.
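A hypothetical concrete implementation (not part of the original text) makes the constraint visible: it must mix in Container[T] as well, otherwise it does not compile.

class InMemoryStorage[T] extends PermanentStorage[T] with Container[T] {
  private var items = List.empty[T]
  def add(o: T): Unit = items = o :: items
}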
Chapter 42: Setting up Scala
Examples
On Linux via dpkg
On Debian-based distributions, including Ubuntu, the most straightforward way is to use the .deb installation file. Go to the Scala website, choose the version you want to install, then scroll down and look for scala-x.x.x.deb. Install the downloaded package, for example with sudo dpkg -i scala-x.x.x.deb. To check that the scala command is available:

which scala

The path returned should point at the installed scala binary. To verify that scala is working:
scala
This should start the Scala REPL, and report the version (which, in turn, should match the version
you downloaded).
curl -O http://downloads.lightbend.com/scala/2.xx.x/scala-2.xx.x.tgz
tar -xzvf scala-2.xx.x.tgz
mv scala-2.xx.x /usr/local/share/scala
Add Scala to your PATH by including this text in ~/.profile, ~/.bash_profile, or ~/.bashrc:

export SCALA_HOME=/usr/local/share/scala
export PATH=$SCALA_HOME/bin:$PATH
which scala
The path returned should match what you added to your PATH variable. To verify that scala is working:
scala
This should start the Scala REPL, and report the version (which, in turn, should match the version
you downloaded).
On Mac OSX computers with MacPorts installed, open a terminal window and type:
This will list all the Scala-related packages available. To install one (in this example the 2.11
version of Scala):
All dependencies will automatically be installed and your $PATH parameter updated. To verify
everything worked:
which scala
scala
This will open up the Scala REPL, and report the version number installed.
Chapter 43: Single Abstract Method Types
(SAM Types)
Remarks
Single Abstract Method (SAM) types are types that have exactly one abstract member; the concept was popularized by Java 8.
Examples
Lambda Syntax
NOTE: This is only available in Scala 2.12+ (and in recent 2.11.x versions with the -Xexperimental -Xfuture compiler flags).
2.11.8
trait Runnable {
def run(): Unit
}
2.11.8
trait Runnable {
def run(): Unit
def concrete: Int = 42
}
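The usage part of this example is not shown in the extract; with SAM conversion enabled, a function literal can be assigned wherever the trait is expected, along these lines:

val r: Runnable = () => println("run!")
r.run() // prints: run!

// the second trait above still qualifies as a SAM type: only `run` is abstract,
// so a lambda can implement it while `concrete` keeps its default body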
Chapter 44: Streams
Remarks
Streams are lazily-evaluated, meaning they can be used to implement generators, which will
provide or 'generate' a new item of the specified type on-demand, rather than before the fact. This
ensures only the computations necessary are done.
Examples
Using a Stream to Generate a Random Sequence
genRandom creates a stream of random numbers that has a one in four chance of terminating each
time it's called.
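The definition of genRandom is not included in the extract; a minimal sketch consistent with that description is:

import scala.util.Random

def genRandom: Stream[Double] = {
  val r = Random.nextDouble()
  if (r > 0.25) r #:: genRandom   // three times out of four: keep generating
  else Stream.empty               // one time out of four: terminate the stream
}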
lazy val randos = genRandom // getRandom is lazily evaluated as randos is iterated through
for {
x <- randos
} println(x) // The number of times this prints is effectively randomized.
Note the #:: construct, which lazily recurses: because it is prepending the current random number
to a stream, it does not evaluate the remainder of the stream until it is iterated through.
Streams can be built that reference themselves and thus become infinitely recursive.
// factorial
val fact: Stream[BigInt] = 1 #:: fact.zipWithIndex.map{case (p,x)=>p*(x+1)}
fact.take(10) // (1, 1, 2, 6, 24, 120, 720, 5040, 40320, 362880)
fact(24) // 620448401733239439360000
In this context the difference between var, val, and def is interesting. As a def, each element is recalculated every time it is referenced. As a val, each element is retained and reused after it has been calculated. This can be demonstrated by creating a side effect with each calculation.
// now as val
val fact: Stream[Int] = 1 #:: fact.zipWithIndex.map{case (p,x)=>print("!");p*(x+1)}
fact(5) // !!!!! 120
fact(4) // 24
fact(7) // !! 5040
This also explains why the random number Stream doesn't work as a val.
Chapter 45: String Interpolation
Remarks
This feature exists in Scala 2.10.0 and above.
Examples
Hello String Interpolation
val num = 42d   // `num` is assumed; it is not defined in the excerpt
println(f"$num%2.2f")
42.00
println(f"$num%e")
4.200000e+01
println(f"$num%a")
0x1.5p5
You can use curly braces to interpolate expressions into string literals:
s"${a}" // "A"
s"${f(a)}" // "AA"
Without the braces, scala would only interpolate the identifier after the $ (in this case f). Since
there is no implicit conversion from f to a String this is an exception in this example:
my"foo${bar}baz"
Note that there is no restriction on the arguments or return type of the interpolation function. This
leads us down a dark path where interpolation syntax can be used creatively to construct arbitrary
objects, as illustrated in the following example:
let"a=${4}" // Let(a, 4)
let"b=${"foo"}" // error: type mismatch
let"c=" // error: not enough arguments for method let: (value: Int)Let
It is also possible to use Scala's string interpolation feature to create elaborate extractors (pattern
matchers), as perhaps most famously employed in the quasiquotes API of Scala macros.
As an example, consider the following extractor which extracts path segments by constructing a
regular expression from the StringContext parts. We can then delegate most of the heavy lifting to
the unapplySeq method provided by the resulting scala.util.matching.Regex:
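The definition of the path extractor is not included in the extract; a sketch consistent with the description (quote each literal part, put a capturing group between the parts, and let the resulting Regex supply unapplySeq) is:

object PathInterpolation {
  implicit class PathExtractor(val sc: StringContext) extends AnyVal {
    def path: scala.util.matching.Regex =
      sc.parts.map(java.util.regex.Pattern.quote).mkString("([^/]+)").r
  }
}
import PathInterpolation._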
"/documentation/scala/1629/string-interpolation" match {
case path"/documentation/${topic}/${id}/${_}" => println(s"$topic, $id")
case _ => ???
}
Note that the path object could also define an apply method in order to behave as a regular
interpolator as well.
You can use the raw interpolator if you want a String to be printed as is and without any escaping
of literals.
The code for this example survives only as a fragment (an interpolated string ending in "Monde"); the point is that with the raw interpolator, escape sequences such as \n are printed to the console exactly as written rather than being interpreted.
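A minimal sketch consistent with that fragment (the exact string is an assumption):

println(raw"Bonjour\nLe\nMonde")
// the \n is printed literally instead of starting new lines:
// Bonjour\nLe\nMonde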
Chapter 46: Symbol Literals
Remarks
Scala comes with a concept of symbols: strings that are interned, that is, two symbols with the same name (the same character sequence) will, unlike strings, refer to the same object during execution.
Symbols are a feature of many languages (Lisp, Ruby, Erlang and more); in Scala, however, they see relatively little use. They are a good feature to have nevertheless.
Use:
Any literal beginning with a single quote ', followed by one or more digits, letters, or underscores _, is a symbol literal, except that the first character after the quote cannot be a digit.
Good definitions:
'ATM
'IPv4
'IPv6
'map_to_operations
'data_format_2006
Symbol("hakuna matata")
Symbol("To be or not to be that is a question")
Bad definitions:
'8'th_division
'94_pattern
'bad-format
Examples
Replacing strings in case clauses
Let's say we have multiple data sources which include database, file, prompt and argumentList.
Depending on chosen source we change our approach:
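The example code is not included in the extract; a sketch consistent with the description (the handler bodies are assumptions) is:

def loadData(source: Symbol): Unit = source match {
  case 'database     => println("connecting to the database")
  case 'file         => println("reading from a file")
  case 'prompt       => println("asking the user")
  case 'argumentList => println("parsing the argument list")
  case _             => println("unknown source")
}

loadData('file) // prints: reading from a file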
We could very well have used String in place of Symbol. We didn't, because none of String's features are useful in this context.
Chapter 47: synchronized
Syntax
• objectToSynchronizeOn.synchronized { /* code to run */}
• synchronized {/* code to run, can be suspended with wait */}
Examples
synchronize on an object
synchronized is a low-level concurrency construct that can help prevent multiple threads from accessing the same resource at the same time; see the introduction to synchronization for the JVM using the Java language for background.
anInstance.synchronized {
  // code to run while the intrinsic lock on `anInstance` is held;
  // other threads cannot enter this block concurrently unless `wait` is called
  // on `anInstance` to suspend this thread, and they can resume this thread's
  // execution by calling `notify` or `notifyAll` on `anInstance`
}
In case of objects it might synchronize on the class of the object, not on the singleton instance.
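For the second syntax form, a bare synchronized block inside a class locks on the current instance; a small sketch (not from the original text):

class Counter {
  private var count = 0
  def increment(): Int = synchronized {  // locks on `this`
    count += 1
    count
  }
}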
Chapter 48: Testing with ScalaCheck
Introduction
ScalaCheck is a library written in Scala and used for automated property-based testing of Scala or
Java programs. ScalaCheck was originally inspired by the Haskell library QuickCheck, but has
also ventured into its own.
ScalaCheck has no external dependencies other than the Scala runtime, and works great with sbt,
the Scala build tool. It is also fully integrated in the test frameworks ScalaTest and specs2.
Examples
Scalacheck with scalatest and error messages
import org.scalatest.prop.Checkers
import org.scalatest.{Matchers, WordSpecLike}
import org.scalacheck.Gen._
import org.scalacheck.Prop._
import org.scalacheck.Prop
object Splitter {
def splitLineByColon(message: String): (String, String) = {
val (command, argument) = message.indexOf(":") match {
case -1 =>
(message, "")
case x: Int =>
(message.substring(0, x), message.substring(x + 1))
}
(command.trim, argument.trim)
}
}
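The test class itself is missing from the extract; a sketch of what it presumably looked like, exercising Splitter.splitLineByColon with generated input and using labelled properties (:|) so that failures carry readable error messages:

class SplitterSpec extends WordSpecLike with Matchers with Checkers {

  "Splitter" should {
    "split the command and the argument at the first colon" in {
      check {
        forAll(alphaStr, alphaStr) { (command: String, argument: String) =>
          val (c, a) = Splitter.splitLineByColon(s"$command:$argument")
          ((c == command) :| s"command was '$c'") &&
            ((a == argument) :| s"argument was '$a'")
        }
      }
    }
  }
}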
Chapter 49: Testing with ScalaTest
Examples
Hello World Spec Test
Create a testing class in the src/test/scala directory, in a file named HelloWorldSpec.scala. Put this
inside the file:
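The file contents are missing from the extract; a minimal sketch of such a spec (the exact assertion is an assumption) is:

import org.scalatest.{FlatSpec, Matchers}

class HelloWorldSpec extends FlatSpec with Matchers {

  "Hello World" should "start with 'Hello'" in {
    val helloWorld = "Hello World"
    helloWorld should startWith ("Hello")
  }
}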
• This example is making use of FlatSpec and Matchers, which are part of the ScalaTest library.
• FlatSpec allows tests to be written in the Behavior-Driven Development (BDD) style. In this
style, a sentence is used to describe the expected behavior of a given unit of code. The test
confirms that the code adheres to that behavior. See the documentation for additional
information.
Setup
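The setup snippet is missing here; ScalaTest is typically added as a test dependency in build.sbt (the version number is an assumption):

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"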
Type check
Note that the brackets here are used to get type String.
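The snippet itself is not shown; it presumably used the a [...] syntax from Matchers:

helloWorld shouldBe a [String]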
Equal check
To test equality:
helloWorld shouldEqual "Hello World"
helloWorld should === ("Hello World")
helloWorldCount shouldEqual 1
helloWorldCount shouldBe 1
helloWorldList shouldEqual List("Hello World", "Bonjour Le Monde")
helloWorldList === List("Hello World", "Bonjour Le Monde")
To test inequality:
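The inequality snippet is missing; it presumably looked like:

helloWorld should not equal "Hello"
helloWorldCount should not be 2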
Length check
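The length-check snippet is missing; a sketch:

helloWorld should have length 11
helloWorldList should have size 2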
Exceptions check
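The exception-check snippet is missing; a sketch using both the Matchers syntax and intercept:

an [IndexOutOfBoundsException] should be thrownBy helloWorld.charAt(-1)

intercept[IndexOutOfBoundsException] {
  helloWorld.charAt(-1)
}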
Chapter 50: Traits
Syntax
• trait ATrait { ... }
• class AClass (...) extends ATrait { ... }
• class AClass extends BClass with ATrait
• class AClass extends ATrait with BTrait
• class AClass extends ATrait with BTrait with CTrait
• class ATrait extends BTrait
Examples
Stackable Modification with Traits
You can use traits to modify methods of a class, applying the traits in stackable fashion.
The following example shows how traits can be stacked. The ordering of the traits is important: using a different order of traits, different behavior is achieved.
class Ball {
def roll(ball : String) = println("Rolling : " + ball)
}
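The stackable traits themselves are not shown in the extract; definitions consistent with the output below would be the following, each decorating the ball name and delegating to super.roll:

trait Shiny extends Ball {
  override def roll(ball: String) = super.roll("Shiny-" + ball)
}

trait Red extends Ball {
  override def roll(ball: String) = super.roll("Red-" + ball)
}

Reversing the mix-in order (new Ball with Red with Shiny) would print Rolling : Red-Shiny-Ball-2 instead, which is what "different order, different behavior" refers to.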
object Balls {
  def main(args: Array[String]) {
    val ball1 = new Ball with Shiny with Red
    ball1.roll("Ball-1") // Rolling : Shiny-Red-Ball-1
  }
}
Note that super is used to invoke the roll() method in both traits; only in this way can we achieve stackable modification. In cases of stackable modification, the method invocation order is determined by the linearization rule.
Trait Basics
trait Identifiable {
  def getIdentifier: String
  def printIdentification(): Unit = println(getIdentifier)
}
Since no super class is declared for trait Identifiable, by default it extends from AnyRef class.
Because no definition for getIdentifier is provided in Identifiable, the Puppy class must implement
it. However, Puppy inherits the implementation of printIdentification from Identifiable.
In the REPL:
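Neither the Puppy class nor the REPL session is included in the extract; a minimal sketch consistent with the description (the body of getIdentifier is an assumption):

scala> class Puppy(name: String) extends Identifiable {
     |   def getIdentifier: String = s"$name the puppy"
     | }
defined class Puppy

scala> new Puppy("Rex").printIdentification()
Rex the puppy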
The diamond problem, or multiple inheritance, is handled by Scala using Traits, which are similar
to Java interfaces. Traits are more flexible than interfaces and can include implemented methods.
This makes traits similar to mixins in other languages.
Scala does not support inheritance from multiple classes, but a user can extend multiple traits in a
single class:
trait traitA {
def name = println("This is the 'grandparent' trait.")
}
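The remaining definitions are missing from the extract; versions consistent with the output and the linearization given below are:

trait traitB extends traitA {
  override def name = {
    println("B is a child of A.")
    super.name
  }
}

trait traitC extends traitA {
  override def name = {
    println("C is a child of A.")
    super.name
  }
}

object grandChild extends traitB with traitC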
grandChild.name
Here grandChild is inheriting from both traitB and traitC, which in turn both inherit from traitA.
The output (below) also shows the order of precedence when resolving which method
implementations are called first:
C is a child of A.
B is a child of A.
This is the 'grandparent' trait.
Note that when super is used to invoke methods in a class or trait, the linearization rule comes into play to decide the call hierarchy. The linearization order for grandChild will be:
grandChild -> traitC -> traitB -> traitA -> AnyRef -> Any
trait Printer {
def print(msg : String) = println (msg)
}
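The stackable traits and CustomPrinter are not shown in the extract; definitions consistent with the output and the linearization below would be:

trait DelimitWithHyphen extends Printer {
  override def print(msg: String) = {
    println("-------------")
    super.print(msg)
  }
}

trait DelimitWithStar extends Printer {
  override def print(msg: String) = {
    println("*************")
    super.print(msg)
  }
}

class CustomPrinter extends Printer with DelimitWithHyphen with DelimitWithStar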
object TestPrinter{
def main(args: Array[String]) {
new CustomPrinter().print("Hello World!")
}
}
*************
-------------
Hello World!
CustomPrinter -> DelimitWithStar -> DelimitWithHyphen -> Printer -> AnyRef -> Any
Linearization
In case of stackable modification, Scala arranges classes and traits in a linear order to determine the method call hierarchy; this is known as linearization. The linearization rule is used only for those methods that involve invocation via super. Let's consider an example:
class Shape {
def paint (shape: String): Unit = {
println(shape)
}
}
because, in MyShape's linearization until now (Step 2), Shape -> AnyRef -> Any has already
appeared. Hence, it is ignored. Thus, the Blue linearization will be:
Blue -> Color -> Dotted -> Border -> Shape -> AnyRef -> Any
4. Finally, Circle will be added and final linearization order will be:
Circle -> Blue -> Color -> Dotted -> Border -> Shape -> AnyRef -> Any
This linearization order decides invocation order of methods when super is used in any class or
trait. The first method implementation from the right is invoked, in the linearization order. If new
MyShape().paint("Circle ") is executed, it will print:
Chapter 51: Tuples
Remarks
Why are tuples limited to length 22?
Tuples are rewritten as objects by the compiler. The compiler has access to Tuple1 through Tuple22. This arbitrary limit was decided by the language designers.
Examples
Creating a new Tuple
A tuple is a heterogeneous collection of two to twenty-two values. A tuple can be defined using
parentheses. For tuples of size 2 (also called a 'pair') there's an arrow syntax.
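The definition of x is not shown in the extract; it is presumably something like:

val x = (1, "hello")   // parentheses syntax
val y = 1 -> "hello"   // arrow syntax, equivalent for pairs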
x is a tuple of size two. To access the elements of a tuple use ._1, through ._22. For instance, we
can use x._1 to access the first element of the x tuple. x._2 accesses the second element. More
elegantly, you can use tuple extractors.
The arrow syntax for creating tuples of size two is primarily used in Maps, which are collections of
(key -> value) pairs:
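The map m used below is not defined in the extract; consistent with the REPL output, it is assumed to be:

val m = Map(2 -> "world")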
scala> m + x
res0: scala.collection.immutable.Map[Int,String] = Map(2 -> world, 1 -> hello)
scala> (m + x).toList
res1: List[(Int, String)] = List((2,world), (1,hello))
The arrow syntax used for the pair going into the map makes it clear which element is the key and which is the value associated with that key.
Tuples are often used within collections but they must be handled in a specific way. For example,
given the following list of tuples:
It may seem natural to add the elements together using implicit tuple-unpacking:
Scala cannot implicitly unpack the tuples in this manner. We have two options to fix this map. The
first is to use the positional accessors _1 and _2:
The other option is to use a case statement to unpack the tuples using pattern matching:
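The snippets for this paragraph are missing from the extract; a sketch of all three (the list itself is an assumption):

val pairs = List((1, 2), (3, 4))

// pairs.map((a, b) => a + b)       // does not compile: tuples are not unpacked implicitly

pairs.map(t => t._1 + t._2)         // positional accessors: List(3, 7)
pairs.map { case (a, b) => a + b }  // pattern matching:     List(3, 7)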
Chapter 52: Type Classes
Remarks
To avoid serialization problems, particularly in distributed environments (e.g. Apache Spark), it is a
best practice to implement the Serializable trait for type class instances.
Examples
Simple Type Class
trait Show[A] {
def show(a: A): String
}
Instead of extending a type class, an implicit instance of the type class is provided for each
supported type. Placing these implementations in the companion object of the type class allows
implicit resolution to work without any special imports:
object Show {
  implicit val intShow: Show[Int] = new Show[Int] {
    def show(x: Int): String = x.toString
  }
  // ..etc
}
If you want to guarantee that a generic parameter passed to a function has an instance of a type
class, use implicit parameters:
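The log method discussed below is not shown in the extract; it presumably looks like:

def log[A](a: A)(implicit s: Show[A]): Unit = println(s.show(a))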
Call the above log method like any other method. It will fail to compile if an implicit Show[A]
implementation can't be found for the A you pass to log
log(10)                                 // prints: "10"
log(new java.util.Date(1469491668401L)) // prints: "1469491668401"

log(List(1,2,3)) // fails to compile with:
                 // could not find implicit value for evidence parameter of type Show[List[Int]]
This example implements the Show type class. This is a common type class used to convert
arbitrary instances of arbitrary types into Strings. Even though every object has a toString
method, it's not always clear whether or not toString is defined in a useful way. With use of the
Show type class, you can guarantee that anything passed to log has a well-defined conversion to
String.
trait Show[A] {
  def show(a: A): String
}
To make a class you control (and is written in Scala) extend the type class, add an implicit to its
companion object. Let us show how we can get the Person class from this example to extend Show:
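The Person class itself is not shown; something like the following is assumed:

case class Person(fullname: String)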
We can make this class extend Show by adding an implicit to Person's companion object:
object Person {
  implicit val personShow: Show[Person] = new Show[Person] {
    def show(p: Person): String = s"Person(${p.fullname})"
  }
}
A companion object must be in the same file as the class, so you need both class Person and
object Person in the same file.
To make a class you do not control, or is not written in Scala, extend the type class, add an implicit
to the companion object of the type class, as shown in the Simple Type Class example.
If you control neither the class nor the type class, create an implicit as above anywhere, and
import it. Using the log method on the Simple Type Class example:
object MyShow {
  implicit val personShow: Show[Person] = new Show[Person] {
    def show(p: Person): String = s"Person(${p.fullname})"
  }
}
import MyShow.personShow

persons foreach { p => log(p) }   // `persons` is assumed to be a collection of Person values
Scala's implementation of type classes is rather verbose. One way to reduce the verbosity is to
introduce so-called "Operation Classes". These classes will automatically wrap a variable/value
when they are imported to extend functionality.
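The Addable type class and its instances are not included in the extract; a sketch consistent with the surrounding code (the instance bodies are assumptions) is:

trait Addable[A] {
  def add(a: A, b: A): A
}

object Instances {
  implicit object addableInt extends Addable[Int] {
    def add(a: Int, b: Int): Int = a + b
  }
  implicit object addableString extends Addable[String] {
    def add(a: String, b: String): String = a + b
  }
}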
import Instances._

val three = addableInt.add(1, 2)   // works, but is verbose
We would rather just write 1.add(2). Therefore we'll create an "Operation Class" (also called an "Ops Class") that will always wrap over a type that implements Addable.
object Ops {
implicit class AddableOps[A](self: A)(implicit A: Addable[A]) {
def add(other: A): A = A.add(self, other)
}
}
Now we can use our new function add as if it was part of Int and String:
object Main {
  def main(args: Array[String]): Unit = {  // assumed; needed to match the two closing braces below
import Instances._ // import evidence objects into this scope
import Ops._ // import the wrappers
println(1.add(5))
println("mag".add("net"))
// println(1.add(3.141)) // Fails because we didn't create an instance for Double
}
}
import simulacrum._
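The rest of this section is missing from the extract. The simulacrum library is typically used to generate exactly this kind of Ops boilerplate automatically; a sketch (not the original code) of how the Addable type class above could be declared with it:

@typeclass trait Addable[A] {
  def add(x: A, y: A): A
}

// simulacrum's @typeclass macro generates the companion, the Ops wrapper and the
// syntax import (Addable.ops._), so 1.add(2) works without hand-written wrappers.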
Chapter 53: Type Inference
Examples
Local Type Inference
Scala has a powerful type-inference mechanism built into the language. This mechanism is termed 'local type inference':
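The inferred versions referred to below are missing from the extract; they are presumably:

val i = 1 + 2                 // inferred as Int
val s = "I am a String"       // inferred as String
def squared(x: Int) = x * x   // return type inferred as Int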
The compiler can infer the type of variables from the initialization expression. Similarly, the return
type of methods can be omitted, since they are equivalent to the type returned by the method
body. The above examples are equivalent to the below, explicit type declarations:
val i: Int = 1 + 2
val s: String = "I am a String"
def squared(x : Int): Int = x*x
The Scala compiler can also deduce type parameters when polymorphic methods are called, or
when generic classes are instantiated:
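The snippet for this sentence is missing; a small sketch of both cases (the names are hypothetical):

case class Box[A](value: A)
def pair[A, B](a: A, b: B): (A, B) = (a, b)

val b = Box("hello")    // Box[String] is inferred; no need to write Box[String]("hello")
val p = pair(1, "one")  // (Int, String) is inferred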
The above form of type inference is similar to the Diamond Operator, introduced in Java 7.
Limitations to Inference
There are scenarios in which Scala type-inference does not work. For instance, the compiler
cannot infer the type of method parameters:
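The snippet illustrating this is missing; a sketch (hypothetical method name):

// def add(a, b) = a + b          // does not compile: parameter types are never inferred
def add(a: Int, b: Int) = a + b   // OK; only the return type may be omitted

A second limitation, shown by the factorial example below, is that recursive methods always need an explicit return type.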
// Does not compile
def factorial(n: Int) = if (n == 0 || n == 1) 1 else n * factorial(n - 1)
// Compiles
def factorial(n: Int): Int = if (n == 0 || n == 1) 1 else n * factorial(n - 1)
When you try to call such a generic method without specifying the type parameter, Nothing gets inferred, which is not very useful for an actual implementation (and its result is not useful). With the following solution the NotNothing context bound can prevent using the method without specifying the expected type (in this example RuntimeClass is also excluded, because for ClassTags not only Nothing but also RuntimeClass can be inferred):
// the NotNothing trait itself is missing from the extract; it is assumed to be:
sealed trait NotNothing[-T]

object NotNothing {
  implicit object notNothing extends NotNothing[Any]

  // We do not want Nothing to be inferred, so make an ambiguous implicit
  implicit object `\n The error is because the type parameter was resolved to Nothing`
    extends NotNothing[Nothing]

  // For ClassTags, RuntimeClass can also be inferred, so make that ambiguous too
  implicit object `\n The error is because the type parameter was resolved to RuntimeClass`
    extends NotNothing[RuntimeClass]
}
object ObjectStore {
  // Using context bounds
  def get[T: NotNothing]: Option[T] = {
    ???
  }

  // a newArray method (used below) is assumed to exist here as well, e.g.:
  // def newArray[T: NotNothing : scala.reflect.ClassTag](size: Int = 10): Array[T] = new Array[T](size)
}
Example usage:
object X {
//Fails to compile
//val nothingInferred = ObjectStore.get
//Fails to compile
//val runtimeClassInferred = ObjectStore.newArray()
}
Read Type Inference online: https://riptutorial.com/scala/topic/4918/type-inference
Chapter 54: Type Parameterization (Generics)
Examples
The Option type
A nice example of a parameterized type is the Option type. It is essentially just the following
definition (with several more methods defined on the type):
// a sketch of the (truncated) definition:
trait Option[+A] {
  def isEmpty: Boolean
  def get: A

  // fold is a parameterized method: B is a type parameter independent of A
  def fold[B](ifEmpty: => B)(f: A => B): B =
    if (isEmpty) ifEmpty else f(this.get)

  // lots of methods...
}
We can also see that this has a parameterized method, fold, which returns something of type B.
Parameterized Methods
The return type of a method can depend on the type of the parameter. In this example, x is the
parameter, A is the type of x, which is known as the type parameter.
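The method itself is not shown; given the calls below it is presumably the identity-like:

def f[A](x: A): A = x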
f(1) // 1
f("two") // "two"
f[Float](3) // 3.0F
Scala will use type inference to determine the return type, which constrains what methods may be
called on the parameter. Thus, care must be taken: the following is a compile-time error because *
is not defined for every type A:
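The failing snippet is missing; a sketch (the method name is hypothetical), together with one way to make it compile by requiring the capability explicitly:

// def double[A](x: A): A = x * 2   // does not compile: nothing guarantees A has a `*` method

def double[A](x: A)(implicit num: Numeric[A]): A = num.times(x, num.fromInt(2))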
Generic collection
Defining the list of Ints
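The IntList trait is not shown in the extract; it presumably mirrors the generic version further below, with Int hard-coded:

trait IntList {
  def isEmpty: Boolean
  def head: Int
  def tail: IntList
}

The generic List[T] below replaces the hard-coded Int with a type parameter.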
class Cons(val head: Int, val tail: IntList) extends IntList { ... }
trait List[T] {
def isEmpty: Boolean
def head: T
def tail: List[T]
}
Chapter 55: Type Variance
Examples
Covariance
The + symbol marks a type parameter as covariant - here we say that "Producer is covariant on A":
trait Producer[+A] {
def produce: A
}
A covariant type parameter can be thought of as an "output" type. Marking A as covariant asserts
that Producer[X] <: Producer[Y] provided that X <: Y. For example, a Producer[Cat] is a valid
Producer[Animal], as all produced cats are also valid animals.
A covariant type parameter cannot appear in contravariant (input) position. The following example
will not compile as we are asserting that Co[Cat] <: Co[Animal], but Co[Cat] has def handle(a: Cat):
Unit which cannot handle any Animal as required by Co[Animal]!
trait Co[+A] {
def produce: A
def handle(a: A): Unit
}
One approach to dealing with this restriction is to use type parameters bounded by the covariant
type parameter. In the following example, we know that B is a supertype of A. Therefore given
Option[X] <: Option[Y] for X <: Y, we know that Option[X]'s def getOrElse[B >: X](b: => B): B can
accept any supertype of X - which includes the supertypes of Y as required by Option[Y]:
trait Option[+A] {
def getOrElse[B >: A](b: => B): B
}
Invariance
By default all type parameters are invariant - given trait A[B], we say that "A is invariant on B".
This means that given two parametrizations A[Cat] and A[Animal], we assert no sub/superclass
relationship between these two types - it does not hold that A[Cat] <: A[Animal] nor that A[Cat] >:
A[Animal] regardless of the relationship between Cat and Animal.
Variance annotations provide us with a means of declaring such a relationship, and imposes rules
on the usage of type parameters so that the relationship remains valid.
Contravariance
The - symbol marks a type parameter as contravariant - here we say that "Handler is contravariant
on A":
trait Handler[-A] {
def handle(a: A): Unit
}
A contravariant type parameter cannot appear in covariant (output) position. The following
example will not compile as we are asserting that a Contra[Animal] <: Contra[Cat], however a
Contra[Animal] has def produce: Animal which is not guaranteed to produce cats as required by
Contra[Cat]!
trait Contra[-A] {
def handle(a: A): Unit
def produce: A
}
Beware however: for the purposes of overloading resolution, contravariance also counterintuitively
inverts the specificity of a type on the contravariant type parameter - Handler[Animal] is considered
to be "more specific" than Handler[Cat].
As it is not possible to overload methods on type parameters, this behavior generally only
becomes problematic when resolving implicit arguments. In the following example ofCat will never
be used, as the return type of ofAnimal is more specific:
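The definitions of ofCat and ofAnimal are not included in the extract; a sketch consistent with the text (the Animal/Cat classes and the handler bodies are assumptions) is:

class Animal
class Cat extends Animal

implicit def ofAnimal: Handler[Animal] = new Handler[Animal] {
  def handle(a: Animal): Unit = println("handling an animal")
}
implicit def ofCat: Handler[Cat] = new Handler[Cat] {
  def handle(c: Cat): Unit = println("handling a cat")
}

With these in scope, the implicitly call below resolves to ofAnimal rather than ofCat.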
implicitly[Handler[Cat]].handle(new Cat)
This behavior is currently slated to change in dotty, and is why (as an example)
scala.math.Ordering is invariant on its type parameter T. One workaround is to make your typeclass
invariant, and type-parametrize the implicit definition in the event that you want it to apply to
subclasses of a given type:
trait Person
object Person {
implicit def ordering[A <: Person]: Ordering[A] = ???
}
Covariance of a collection
Because collections are typically covariant in their element type*, a collection of a subtype may be
passed where a super type is expected:
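The Animal base class is not shown in the extract; it presumably exposes a name, e.g.:

abstract class Animal {
  def name: String
}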
case class Dog(name: String) extends Animal
object Animal {
def printAnimalNames(animals: Seq[Animal]) = {
animals.foreach(animal => println(animal.name))
}
}
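The myDogs value used below is assumed to be:

val myDogs = Seq(Dog("Curly"), Dog("Larry"), Dog("Moe"))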
Animal.printAnimalNames(myDogs)
// Curly
// Larry
// Moe
It may not seem like magic, but the fact that a Seq[Dog] is accepted by a method that expects a
Seq[Animal] is the entire concept of a higher-kinded type (here: Seq) being covariant in its type
parameter.
There is also a way to have a single method accept a covariant argument, instead of having the
whole trait covariant. This may be necessary because you would like to use T in a contravariant
position, but still have it covariant.
trait LocalVariance[T]{
/// ??? throws a NotImplementedError
def produce: T = ???
// the implicit evidence provided by the compiler confirms that S is a
// subtype of T.
def handle[S](s: S)(implicit evidence: S <:< T) = {
// and we can use the evidence to convert s into t.
val t: T = evidence(s)
???
}
}
trait A {}
trait B extends A {}
object Test {
  val lv = new LocalVariance[A] {}
}
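The usage part of this example is missing; a sketch (note that produce and handle end in ???, so the calls compile but would throw NotImplementedError if run):

Test.lv.handle(new B {})      // compiles: the compiler supplies the evidence that B <:< A
// Test.lv.handle("a string") // does not compile: String is not a subtype of A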
Chapter 56: Type-level Programming
Examples
Introduction to type-level programming
If we consider a heterogeneous list, wherein the elements of the list have varied but known types, it might be desirable to be able to perform operations on the elements of the list collectively without discarding the elements' type information. The following example implements a mapping operation over a simple heterogeneous list.
Because the element type varies, the class of operations we can perform is restricted to some
form of type projection, so we define a trait Projection having abstract type Apply[A] computing the
result type of the projection, and def apply[A](a: A): Apply[A] computing the result value of the
projection.
trait Projection {
type Apply[A] // <: Any
def apply[A](a: A): Apply[A]
}
In implementing type Apply[A] we are programming at the type level (as opposed to the value
level).
Our heterogeneous list type defines a map operation parametrized by the desired projection as well as the projection's type. The result of the map operation is abstract, will vary by implementing class and projection, and must naturally still be an HList:
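The HList trait itself is not shown in the extract; a definition consistent with the description is:

trait HList {
  type Map[P <: Projection] <: HList
  def map[P <: Projection](p: P): Map[P]
}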
In the case of HNil, the empty heterogeneous list, the result of any projection will always be HNil itself. Here we declare trait HNil as a convenience so that we may write HNil as a type in lieu of HNil.type:
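The HNil definition is missing; a sketch consistent with that description:

trait HNil extends HList {
  type Map[P <: Projection] = HNil
  def map[P <: Projection](p: P): Map[P] = this
}

case object HNil extends HNil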
HCons is the non-empty heterogeneous list. Here we assert that when applying a map operation, the resulting head type is that which results from the application of the projection to the head value (P#Apply[H]), and that the resulting tail type is that which results from mapping the projection over the tail (T#Map[P]), which is known to be an HList:
case class HCons[H, T <: HList](head: H, tail: T) extends HList {
type Map[P <: Projection] = HCons[P#Apply[H], T#Map[P]]
def map[P <: Projection](p: P): Map[P] = HCons(p.apply(head), tail.map(p))
}
The most obvious such projection is to perform some form of wrapping operation - the following
example yields an instance of HCons[Option[String], HCons[Option[Int], HNil]]:
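The wrapping projection and its use are not shown; a sketch consistent with the stated result type:

object ToOption extends Projection {
  type Apply[A] = Option[A]
  def apply[A](a: A): Apply[A] = Some(a)
}

HCons("foo", HCons(42, HNil)).map(ToOption)
// : HCons[Option[String], HCons[Option[Int], HNil]]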
Chapter 57: User Defined Functions for Hive
Examples
A simple Hive UDF within Apache Spark
import org.apache.spark.sql.functions._
// Create a function that uses the content of the column inside the dataframe
val code = (param: String) => if (param == "myCode") 1 else 0
// With that function, create the udf function
val myUDF = udf(code)
// Apply the udf to a column inside the existing dataframe, creating a dataframe with the additional new column
val newDataframe = aDataframe.withColumn("new_column_name", myUDF(col(inputColumn)))
Chapter 58: Var, Val, and Def
Remarks
As vals are semantically static, they are initialized "in place", wherever they appear in the code. This can produce surprising and undesirable behavior when used in abstract classes and traits.
For example, let's say we would like to make a trait called PlusOne that defines an increment
operation on a wrapped Int. Since Ints are immutable, the value plus one is known at initialization
and will never be changed afterwards, so semantically it's a val. However, defining it this way will
produce an unexpected result.
trait PlusOne {
val i:Int
val incr = i + 1
}
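The IntWrapper class referred to below is not shown; it is presumably:

class IntWrapper(val i: Int) extends PlusOne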
No matter what value i you construct IntWrapper with, calling .incr on the returned object will
always return 1. This is because the val incr is initialized in the trait, before the extending class,
and at that time i only has the default value of 0. (In other conditions, it might be populated with
Nil, null, or a similar default.)
The general rule, then, is to avoid using val on any value that depends on an abstract field.
Instead, use lazy val, which does not evaluate until it is needed, or def, which evaluates every
time it is called. Note however that if the lazy val is forced to evaluate by a val before initialization
completes, the same error will occur.
A fiddle (written in Scala-Js, but the same behavior applies) can be found here.
Examples
Var, Val, and Def
var
A var is a reference variable, similar to variables in languages like Java. Different objects can be
freely assigned to a var, so long as the given object has the same type that the var was declared
with:
scala> var x = 1
x: Int = 1
scala> x = 2
x: Int = 2
Note in the example above the type of the var was inferred by the compiler given the first value
assignment.
val
A val is a constant reference. Thus, a new object cannot be assigned to a val that has already
been assigned.
scala> val y = 1
y: Int = 1
scala> y = 2
<console>:12: error: reassignment to val
y = 2
^
However, the object that a val points to is not constant. That object may be modified:
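The definition of arr is missing from the extract; consistent with the output below, it is assumed to be:

scala> val arr = Array(0, 0)
arr: Array[Int] = Array(0, 0)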
scala> arr(0) = 1
scala> arr
res1: Array[Int] = Array(1, 0)
def
A def defines a method. A method cannot be re-assigned to.
scala> def z = 1
z: Int
scala> z = 2
<console>:12: error: value z_= is not a member of object $iw
z = 2
^
In the above examples, val y and def z return the same value. However, a def is evaluated when
it is called, whereas a val or var is evaluated when it is assigned. This can result in differing
behavior when the definition has side effects:
scala> val a = {println("Hi"); 1}
Hi
a: Int = 1
scala> a + 1
res2: Int = 2
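The definition of b is missing at the page break; it is presumably the def counterpart of a:

scala> def b = { println("Hi"); 1 }
b: Int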
scala> b + 1
Hi
res3: Int = 2
Functions
Because functions are values, they can be assigned to val/var/defs. Everything else works in the
same manner as above:
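The definitions of x, y and z are not included in the extract; they are assumed to be function values along these lines:

scala> val x = (i: Int) => s"value=$i"
x: Int => String = <function1>

scala> var y = (i: Int) => s"value=$i"
y: Int => String = <function1>

scala> def z = (i: Int) => s"value=$i"
z: Int => String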
scala> x(1)
res0: String = value=1
scala> y(2)
res1: String = value=2
scala> z(3)
res2: String = value=3
Lazy val
lazy val is a language feature where the initialization of a val is delayed until it is accessed for the
first time. After that point, it acts just like a regular val.
To use it add the lazy keyword before val. For example, using the REPL:
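The declarations are missing from the extract; consistent with the output that follows, they are presumably:

scala> lazy val foo = { println("Initializing"); "my foo value" }
foo: String = <lazy>

scala> val bar = { println("Initializing bar"); "my bar value" }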
Initializing bar
bar: String = my bar value
scala> foo
Initializing
res3: String = my foo value
scala> bar
res4: String = my bar value
scala> foo
res5: String = my foo value
This example demonstrates the execution order. When the lazy val is declared, all that is saved to the foo value is a lazy function call that hasn't been evaluated yet. When the regular val is set, we see the println call execute and the value is assigned to bar. When we evaluate foo the first time we see println execute, but not when it's evaluated the second time. Similarly, when bar is evaluated we don't see println execute; it only ran when bar was declared.
If tiresomeValue takes a long time to calculate and is not always used, making it a lazy val saves unnecessary computation.
Let's look at an example with two objects that need to be declared at the same time during
instantiation:
object comicBook {
def main(args:Array[String]): Unit = {
gotham.hero.talk()
gotham.villain.talk()
}
}
class Supervillain(val name: String) {
  lazy val toKill = gotham.hero   // `lazy`: gotham is not yet initialized when this class is constructed
  def talk(): Unit = println(s"Let me loosen up Gotham a little bit ${toKill.name}!")
}
object gotham {
val hero: Superhero = new Superhero("Batman")
val villain: Supervillain = new Supervillain("Joker")
}
Without the keyword lazy, such mutually referencing values cannot be members of an object: execution of the program would result in a java.lang.NullPointerException. By using lazy, the reference can be assigned before it is initialized, without fear of having an uninitialized value.
Overloading Def
This functions the same whether inside classes, traits, objects or not.
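The overloading example itself is missing from the extract; a minimal sketch (the method name is hypothetical):

def printType(x: Int): Unit    = println(s"Int: $x")
def printType(x: String): Unit = println(s"String: $x")

printType(1)     // Int: 1
printType("one") // String: one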
Named Parameters
When invoking a def, parameters may be assigned explicitly by name. Doing so means they
needn't be correctly ordered. For example, define printUs() as:
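The definition is missing from the extract; consistent with the calls below, it is presumably:

def printUs(one: String, two: String, three: String): Unit =
  println(s"$one, $two, $three")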
If not all arguments are named, the first arguments are matched by order. No positional (non-
named) argument may follow a named one:
printUs("one", two="two", three="three")  // prints 'one, two, three'
printUs(two="two", three="three", "one")  // fails to compile: 'positional after named argument'
Chapter 59: While Loops
Syntax
• while (boolean_expression) { block_expression }
Parameters
Parameter Details
Remarks
The primary difference between while and do-while loops is whether they execute the
block_expression before they check to see if they should loop.
Because while and do-while loops rely on an expression to evaluate to false to terminate, they
often require mutable state to be declared outside the loop and then modified inside the loop.
Examples
While Loops
var line = 0
var maximum_lines = 5
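The loop itself is missing from the extract; it presumably mirrors the do-while version below:

while (line < maximum_lines) {
  line = line + 1
  println("Line number: " + line)
}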
Do-While Loops
var line = 0
var maximum_lines = 5
do {
line = line + 1
println("Line number: " + line)
} while (line < maximum_lines)
The do/while loop is infrequently used in functional programming, but can be used to work around
the lack of support for the break/continue construct, as seen in other languages:
if(initial_condition) do if(filter) {
...
} while(continuation_condition)
Chapter 60: Working with data in immutable
style
Remarks
Source: http://docs.scala-lang.org/style/naming-conventions.html
Examples
It is not just val vs. var
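The declarations are missing from the extract; consistent with the REPL output below, they are presumably:

scala> val a = 123
a: Int = 123

scala> var b = 123
b: Int = 123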
scala> a = 456
<console>:8: error: reassignment to val
a = 456
scala> b = 321
b: Int = 321
• val references are unchangeable: like a final variable in Java, once it has been initialized you cannot change it
• var references are reassignable, like a simple variable declaration in Java
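The definitions of mut and imm are not included in the extract; a sketch consistent with the REPL session below (the mutable map is updated in place with +=, while + on the immutable map only returns a new map and leaves imm untouched):

scala> val mut = scala.collection.mutable.Map[String, Int]()
scala> val imm = scala.collection.immutable.Map[String, Int]()

scala> mut += ("123" -> 123); mut += ("456" -> 456); mut += ("789" -> 789)
scala> imm + ("123" -> 123); imm + ("456" -> 456); imm + ("789" -> 789)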
scala> mut
Map(123 -> 123, 456 -> 456, 789 -> 789)
scala> imm
Map()
scala> imm + ("123" -> 123) + ("456" -> 456) + ("789" -> 789)
Map(123 -> 123, 456 -> 456, 789 -> 789)
The Scala standard library offers both immutable and mutable data structures; immutability is a property of the data structure itself, not of the reference to it. Each time an immutable data structure gets "modified", a new instance is produced instead of modifying the original collection in place. Each instance of the collection may share significant structure with another instance.
Let's pick as an example a function, merge2Maps, that takes two Maps and returns a Map containing every element of ma and mb.
A first attempt could be to iterate through the elements of one of the maps using for ((k, v) <- map) and somehow return the merged map.
This very first move immediately adds a constraint: a mutation outside that for is now needed.
This becomes clearer when de-sugaring the for:
// this:
for ((k, v) <- map) { ??? }
// is equivalent to:
map.foreach { case (k, v) => ??? }
foreach relies on side-effects. Every time we want something to happen within a foreach we need
to "side-effect something", in this case we could mutate a variable var result or we can use a
mutable data structure.
Let's assume that ma and mb are scala.collection.immutable.Map. We could create the result Map from ma, then iterate through mb adding its elements; if the key of the current element already exists in ma, we override it with the mb one.
Mutable implementation
So far so good, we "had to use mutable collections" and a correct implementation could be:
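The implementation itself is missing from the extract; a sketch of the mutable version described above:

def merge2Maps(ma: Map[String, Int], mb: Map[String, Int]): Map[String, Int] = {
  val result = scala.collection.mutable.Map.empty[String, Int]
  result ++= ma
  for ((k, v) <- mb) result(k) = v   // mb entries override ma entries
  result.toMap
}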
As expected:
scala> merge2Maps(Map("a" -> 11, "b" -> 12), Map("b" -> 22, "c" -> 23))
Map(a -> 11, b -> 22, c -> 23)
How can we get rid of foreach in this scenario? If all we want to do is iterate over the collection elements and apply a function while accumulating a result, one option is to use .foldLeft:
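The immutable implementation is also missing from the extract; a sketch:

def merge2Maps(ma: Map[String, Int], mb: Map[String, Int]): Map[String, Int] =
  mb.foldLeft(ma) { case (acc, (k, v)) => acc + (k -> v) }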
In this case our "result" is the accumulated value starting from ma, the zero of the .foldLeft.
Intermediate result
Obviously this immutable solution is producing and destroying many Map instances while folding,
but it is worth mentioning that those instances are not a full clone of the Map accumulated but
instead are sharing significant structure (data) with the existing instance.
Easier reasoning
It is easier to reason about the semantics when the code is more declarative, as with the .foldLeft approach. Using immutable data structures can help make our implementation easier to reason about.
Chapter 61: Working With Gradle
Examples
Basic Setup
group 'scala_gradle'
version '1.0-SNAPSHOT'
repositories {
jcenter()
mavenCentral()
maven {
url "https://repo.typesafe.com/typesafe/maven-releases"
}
}
dependencies {
compile group: 'org.scala-lang', name: 'scala-library', version: '2.10.6'
}
After going through the Basic Setup example, you may find yourself repeating most part of it in
every single Scala Gradle project. Smells like boilerplate code...
What if, instead of applying the Scala plugin offered by Gradle, you could apply your own Scala
plugin, which would be responsible for handling all your common build logic, extending, at the
same time, the already existing plugin.
This example is going to transform the previous build logic into a reusable Gradle plugin.
Luckily, in Gradle you can easily write custom plugins with the help of the Gradle API, as outlined in the documentation. As the language of implementation, you can use Scala itself or even Java.
However, most of the examples you can find throughout the docs are written in Groovy. If you
need more code samples or you want to understand what lies behind the Scala plugin, for
instance, you can check the gradle github repo.
The custom plugin will add the following functionality when applied to a project:
• a scalaVersion property object, which will have two overridable default properties
○major = "2.12"
○minor = "0"
• a withScalaVersion function, which applied to a dependency name, will add the scala major
version to ensure binary compatibility (sbt %% operator might ring a bell, otherwise go here
before proceeding)
• a createDirs task to create the necessary directory tree, exactly as in the previous example
Implementation guideline
// per the notes below, the Scala and Maven plugins are applied as well:
apply plugin: 'scala'
apply plugin: 'maven'

repositories {
    mavenLocal()
    mavenCentral()
}

dependencies {
    compile gradleApi()
    compile "org.scala-lang:scala-library:2.12.0"
}
Notes:
• the plugin implementation is written in Scala, thus we need the Gradle Scala Plugin
• in order to use the plugin from other projects, the Gradle Maven Plugin is used; this adds the
install task used for saving the project jar to the Maven Local Repository
• compile gradleApi() adds the gradle-api-<gradle_version>.jar to the classpath
package com.btesila.gradle.plugins

import org.gradle.api.{Plugin, Project}

// a sketch of the plugin class: only its package declaration and closing braces
// survive in this extract, but per the notes below it simply applies the Scala plugin
class ScalaCustomPlugin extends Plugin[Project] {
  override def apply(project: Project): Unit = {
    project.getPlugins.apply("scala")
  }
}
Notes:
• in order to implement a Plugin, just extend Plugin trait of type Project and override the apply
method
• within the apply method, you have access to the Project instance that the plugin is applied to
and you can use it for adding build logic to it
• this plugin does nothing but apply the already existing Gradle Scala Plugin
Firstly, we create a ScalaVersion class, which will hold the two version properties
class ScalaVersion {
var major: String = "2.12"
var minor: String = "0"
}
One cool thing about Gradle plugins is the fact that you can always add or override specific properties. A plugin receives this kind of user input via the ExtensionContainer attached to a Gradle Project instance. For more details, check this out.
By adding the following to the apply method (see the sketch after this list), we are basically doing this:
• if there is not a scalaVersion property defined in the project, we add one with the default values
• otherwise, we get the existing one as an instance of ScalaVersion, to use it further
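The snippet referenced above is not included in the extract; a sketch of what the apply method presumably does with the ExtraPropertiesExtension (the helper name is hypothetical):

import org.gradle.api.Project

def resolveScalaVersion(project: Project): ScalaVersion = {
  val extra = project.getExtensions.getExtraProperties
  if (!extra.has("scalaVersion")) {
    extra.set("scalaVersion", new ScalaVersion)
  }
  extra.get("scalaVersion").asInstanceOf[ScalaVersion]
}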
This is equivalent to writing the following to the build file of the project that applies the plugin:
ext {
    scalaVersion.major = "2.12"
    scalaVersion.minor = "0"
}
4. add the scala-lang library to the project dependencies, using the scalaVersion:

project.getDependencies.add("compile", s"org.scala-lang:scala-library:${scalaVersion.major}.${scalaVersion.minor}")
This is equivalent to writing the following to the build file of the project that applies the plugin:
compile "org.scala-lang:scala-library:2.12.0"
Note: the SourceSetContainer has information about all source directories present in the project. What the Gradle Scala Plugin does is add the extra Scala source sets to the Java ones, as you can see in the plugin docs.
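The CreateDirs task class is not shown in the extract; a sketch of such a task (the exact directory list is an assumption):

import org.gradle.api.DefaultTask
import org.gradle.api.tasks.TaskAction

class CreateDirs extends DefaultTask {
  @TaskAction
  def create(): Unit = {
    val projectDir = getProject.getProjectDir
    Seq("src/main/scala", "src/main/resources", "src/test/scala", "src/test/resources")
      .foreach(path => new java.io.File(projectDir, path).mkdirs())
  }
}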
Add the createDirs task to the project by appending this to the apply method:
project.getTasks.create("createDirs", classOf[CreateDirs])
project.getDependencies.add("compile", s"org.scala-lang:scala-library:${scalaVersion.major}.${scalaVersion.minor}")
val withScalaVersion = (lib: String) => {
val libComp = lib.split(":")
libComp.update(1, s"${libComp(1)}_${scalaVersion.major}")
libComp.mkString(":")
}
project.getExtensions.getExtraProperties.set("withScalaVersion", withScalaVersion)
project.getTasks.create("createDirs", classOf[CreateDirs])
}
}
Each Gradle plugin has an id which is used in the apply statement. For instance, writing the following to a build file triggers Gradle to find and apply the plugin with id scala:

apply plugin: 'scala'

In the same way, we would like to apply our new plugin via its own id. The id is given by the name of a properties file placed under src/main/resources/META-INF/gradle-plugins, whose content points at the implementation class:

implementation-class=com.btesila.gradle.plugins.ScalaCustomPlugin
buildscript {
repositories {
mavenLocal()
mavenCentral()
}
dependencies {
//modify this path to match the installed plugin project in your local repository
classpath 'com.btesila:working-with-gradle:1.0-SNAPSHOT'
}
}
repositories {
mavenLocal()
mavenCentral()
}
2. run gradle createDirs - you should now have all the source directories generated
3. override the scala version by adding this to the build file:
ext {
scalaVersion.major = "2.11"
scalaVersion.minor = "8"
}
println(project.ext.scalaVersion.major)
println(project.ext.scalaVersion.minor)
4. add a dependency library that is binary compatible with the Scala version
dependencies {
compile withScalaVersion("com.typesafe.scala-logging:scala-logging:3.5.0")
}
That's it! You can now use this plugin across all your projects without repeating the same old
boilerplate.
Chapter 62: XML Handling
Examples
Beautify or Pretty-Print XML
The PrettyPrinter utility will 'pretty print' XML documents. The following code snippet pretty prints
unformatted xml:
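The snippet is missing from the extract; a sketch consistent with the output below (the input XML is an assumption):

import scala.xml.{PrettyPrinter, XML}

val xml = XML.loadString("<a>Alana<b><c>Beth</c><d>Catie</d></b></a>")
val prettyPrinter = new PrettyPrinter(150, 4)   // page width 150, indentation 4
println(prettyPrinter.format(xml))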
This will output the content using a page width of 150 and an indentation constant of 4 white-space
characters:
<a>
Alana
<b>
<c>Beth</c>
<d>Catie</d>
</b>
</a>
You can use XML.loadFile("nameoffile.xml") to load xml from a file instead of from a string.
Credits

Chapter (S. No), Title: Contributors

9, Dependency Injection: Hoang Ong
13, Extractors: Andy Hayden, Dan Hulme, Dan Simon, Gábor Bakos, gilad hoch, Idloj, J Cracknell, jwvh, knutwalker, Łukasz, Martin Seeler, Michael Ahlers, Nathaniel Ford, Suma, W.P. McNeill
17, Handling units (measures): Gábor Bakos
18, Higher Order Function: acjay, ches, Nathaniel Ford, nukie, Rajat Jain, Srini
25, Operator Overloading: corvus_192, implicitdef, inzi, mnoronha, Nathaniel Ford, Simon
30, Parser Combinators: Nathaniel Ford
33, Quasiquotes: gregghz
39, scalaz: chengpohi
43, Single Abstract Method Types (SAM Types): Gábor Bakos, Gabriele Petronella, Nathaniel Ford
48, Testing with ScalaCheck: Andrzej Jozwik
49, Testing with ScalaTest: Nadim Bahadoor, Nathaniel Ford
50, Traits: André Laszlo, Andy Hayden, Donald.McLean, Louis F., Nathaniel Ford, Rumoku, Sudhir Singh, Vogon Jeltz
54, Type Parameterization (Generics): akauppi, Andy Hayden, Eero Helenius, Nathaniel Ford, vivek
56, Type-level Programming: J Cracknell
57, User Defined Functions for Hive: Camilo Sampedro