1. Map
1.1 Constructing a Map
// Initialize an empty map (the immutable HashMap must be imported explicitly)
import scala.collection.immutable.HashMap
val scores01 = new HashMap[String, Int]
// Initialize the Map from the specified values (method 1)
val scores02 = Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
// Initialize the Map from the specified values (method 2)
val scores03 = Map(("hadoop", 10), ("spark", 20), ("storm", 30))
All of the Maps obtained in the ways above are immutable. To get a mutable Map, use the mutable variant explicitly:
val scores04 = scala.collection.mutable.Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
1.2 Getting Values
object ScalaApp extends App {
  val scores = Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
  // 1. Obtain the value of the specified key
  println(scores("hadoop"))
  // 2. If the key does not exist, fall back to the given default value
  println(scores.getOrElse("hadoop01", 100))
}
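A related lookup that the snippet above does not show is get, which wraps the result in an Option instead of throwing an exception or falling back to a default. A minimal sketch (the object name is just for illustration):
object GetOptionExample extends App {
  val scores = Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
  // get returns Some(value) when the key exists and None otherwise
  println(scores.get("hadoop"))   // Some(10)
  println(scores.get("hadoop01")) // None
}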
1.3 Adding/Modifying/Deleting a Value
Values in a mutable Map can be added, modified, and deleted.
object ScalaApp extends App {
  val scores = scala.collection.mutable.Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
  // 1. If the key exists, update its value
  scores("hadoop") = 100
  // 2. If the key does not exist, add it
  scores("flink") = 40
  // 3. Multiple updates or additions can be performed with +=
  scores += ("spark" -> 200, "hive" -> 50)
  // 4. A key and its value can be removed with -=
  scores -= "storm"
  for (elem <- scores) { println(elem) }
}
// The output is as follows
(spark,200)
(hadoop,100)
(flink,40)
(hive,50)
Entries cannot be added to, modified in, or removed from an immutable Map, but a new Map can be generated from an existing one.
object ScalaApp extends App {
  val scores = Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
  val newScores = scores + ("spark" -> 200, "hive" -> 50)
  for (elem <- newScores) { println(elem) }
}
// The output is as follows
(hadoop,10)
(spark,200)
(storm,30)
(hive,50)
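In the same way, the - operator returns a new Map with a key removed while leaving the original untouched. A minimal sketch (the object name is just for illustration):
object ImmutableRemoveExample extends App {
  val scores = Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
  // "-" yields a new Map without the given key; scores itself is unchanged
  val withoutStorm = scores - "storm"
  println(withoutStorm) // Map(hadoop -> 10, spark -> 20)
  println(scores)       // Map(hadoop -> 10, spark -> 20, storm -> 30)
}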
1.4 Traversing a Map
object ScalaApp extends App {
  val scores = Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
  // 1. Iterate over the keys
  for (key <- scores.keys) { println(key) }
  // 2. Iterate over the values
  for (value <- scores.values) { println(value) }
  // 3. Iterate over the key-value pairs
  for ((key, value) <- scores) { println(key + ":" + value) }
}
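These loops can equivalently be written with foreach, which is what a for loop over a collection desugars to. A minimal sketch (the object name is just for illustration):
object ForeachExample extends App {
  val scores = Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
  // Traverse the key-value pairs by pattern matching on each tuple
  scores.foreach { case (key, value) => println(key + ":" + value) }
}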
1.5 The yield Keyword
You can use the yield keyword to generate a new Map from an existing Map.
object ScalaApp extends App {
  val scores = Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
  // 1. Multiply every score by 10
  val newScore = for ((key, value) <- scores) yield (key, value * 10)
  for (elem <- newScore) { println(elem) }
  // 2. Swap keys and values
  val reversalScore: Map[Int, String] = for ((key, value) <- scores) yield (value, key)
  for (elem <- reversalScore) { println(elem) }
}
// The output is as follows
(hadoop,100)
(spark,200)
(storm,300)
(10,hadoop)
(20,spark)
(30,storm)
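The same transformations can be expressed with the map method instead of a for/yield expression. A minimal sketch (the object name is just for illustration):
object MapMethodExample extends App {
  val scores = Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
  // Multiply every value by 10 using map and a pattern-matching function
  val newScore = scores.map { case (key, value) => (key, value * 10) }
  println(newScore) // Map(hadoop -> 100, spark -> 200, storm -> 300)
}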
1.6 Other Map Structures
If no implementation is specified, a Map is backed by a HashMap by default. If you want the keys kept in sorted order (TreeMap) or the entries kept in insertion order (LinkedHashMap), you need to say so explicitly.
object ScalaApp extends App {
  // 1. Use TreeMap to keep the keys in lexicographic order
  val scores01 = scala.collection.mutable.TreeMap("B" -> 20, "A" -> 10, "C" -> 30)
  for (elem <- scores01) { println(elem) }
  // 2. Use LinkedHashMap to keep the key-value pairs in insertion order
  val scores02 = scala.collection.mutable.LinkedHashMap("B" -> 20, "A" -> 10, "C" -> 30)
  for (elem <- scores02) { println(elem) }
}
// The output is as follows
(A,10)
(B,20)
(C,30)
(B,20)
(A,10)
(C,30)
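TreeMap also accepts an explicit Ordering if lexicographic order is not what you want. A sketch, assuming reverse alphabetical order is desired (the object name is just for illustration):
object ReverseTreeMapExample extends App {
  // Pass a reverse Ordering so the keys are sorted from "C" down to "A"
  val scores = scala.collection.mutable.TreeMap("B" -> 20, "A" -> 10, "C" -> 30)(Ordering[String].reverse)
  for (elem <- scores) { println(elem) } // (C,30) (B,20) (A,10)
}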
1.7 Other Useful Methods
object ScalaApp extends App {
  val scores = scala.collection.mutable.TreeMap("B" -> 20, "A" -> 10, "C" -> 30)
  // 1. Get the number of entries
  println(scores.size)
  // 2. Check whether the Map is empty
  println(scores.isEmpty)
  // 3. Check whether a specific key is present
  println(scores.contains("A"))
}
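Beyond these, everyday work often relies on keySet and filter as well. A minimal sketch (the object name is just for illustration):
object MoreMethodsExample extends App {
  val scores = Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
  // Keep only the entries whose value is at least 20
  val highScores = scores.filter { case (_, value) => value >= 20 }
  println(highScores)    // Map(spark -> 20, storm -> 30)
  println(scores.keySet) // Set(hadoop, spark, storm)
}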
1.8 Interoperating with Java
import java.util
import scala.collection.JavaConverters._
import scala.collection.mutable

object ScalaApp extends App {
  val scores = Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
  // Scala map to Java map
  val javaMap: util.Map[String, Int] = scores.asJava
  // Java map to Scala map
  val scalaMap: mutable.Map[String, Int] = javaMap.asScala
  for (elem <- scalaMap) { println(elem) }
}
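Note that JavaConverters is deprecated from Scala 2.13 onwards; on 2.13+ the same asJava/asScala decorators live in scala.jdk.CollectionConverters. A minimal sketch (the object name is just for illustration):
import java.util
import scala.collection.mutable
import scala.jdk.CollectionConverters._

object Jdk13InteropExample extends App {
  val scores = Map("hadoop" -> 10, "spark" -> 20, "storm" -> 30)
  val javaMap: util.Map[String, Int] = scores.asJava
  val scalaMap: mutable.Map[String, Int] = javaMap.asScala
  for (elem <- scalaMap) { println(elem) }
}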
2. Tuple
Tuples are similar to arrays, except that all elements in an array must be of the same type, whereas tuples can contain elements of different types.
scala> val tuple = (1, 3.24f, "scala")
tuple: (Int, Float, String) = (1,3.24,scala)
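Individual elements can also be read with the 1-based positional accessors _1, _2, _3, and so on. A sketch of what the REPL would show (the result names depend on the session):
scala> tuple._1
res0: Int = 1
scala> tuple._3
res1: String = scala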
2.1 Pattern Matching
Pattern matching can be used to extract the values of a tuple and bind them to corresponding variables:
scala> val (a,b,c)=tuple
a: Int = 1
b: Float = 3.24
c: String = scala
If a position does not need to be bound, an underscore can be used in its place:
scala> val (a,_,_)=tuple
a: Int = 1
2.2 The zip Method
object ScalaApp extends App {
  val array01 = Array("hadoop", "spark", "storm")
  val array02 = Array(10, 20, 30)
  // 1. The zip method combines the two arrays into an array of tuples
  val tuples: Array[(String, Int)] = array01.zip(array02)
  // 2. Calling toMap after zip converts the result into a Map
  val map: Map[String, Int] = array01.zip(array02).toMap
  for (elem <- tuples) { println(elem) }
  for (elem <- map) { println(elem) }
}
// The output is as follows
(hadoop,10)
(spark,20)
(storm,30)
(hadoop,10)
(spark,20)
(storm,30)
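The inverse operation is unzip, which splits a collection of pairs back into two separate collections. A minimal sketch (the object name is just for illustration):
object UnzipExample extends App {
  val tuples = Array(("hadoop", 10), ("spark", 20), ("storm", 30))
  // unzip splits the pairs into two parallel arrays
  val (names, values) = tuples.unzip
  println(names.mkString(", "))  // hadoop, spark, storm
  println(values.mkString(", ")) // 10, 20, 30
}
Note also that zip stops at the length of the shorter collection when the two inputs differ in size.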
References
- Martin Odersky. Programming in Scala (3rd Edition) [M]. Publishing House of Electronics Industry, 2018-1-1
- Cay S. Horstmann. Scala for the Impatient (2nd Edition) [M]. Publishing House of Electronics Industry, 2017-7
See the GitHub open source project Getting Started with Big Data for more articles in the Big Data series.