Background
In GrowingIO's server-side development, we use gRPC for data communication between microservices. Each project that provides a service must define its own set of Protobuf message formats and then use protoc to generate the corresponding language bindings. In the projects we maintain, we mostly implement services in Scala; each service defines its own domain model (usually a set of case classes), while protoc generates JVM code in Java by default. The Protobuf message formats closely mirror the domain models defined in the Scala projects, and their fields tend to be highly consistent. As a result, a lot of conversion code appears whenever data needs to move between Protobuf-Java objects and Scala case classes.
For example, Protobuf repeated fields correspond to Scala's Seq/List/Array, Timestamp corresponds to ZonedDateTime, and Scala's Option types can be represented with Protobuf wrapper types, such as Option[String] with StringValue. Because each type has its own quirks, and type nesting greatly increases the complexity, we had been looking for a general conversion scheme that is type-safe to use and minimizes bloated, uninteresting code.
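To give a feel for the boilerplate, here is a minimal sketch of the hand-written glue needed for just the Timestamp/ZonedDateTime pair (the time-zone choice is illustrative):

import java.time.{Instant, ZoneId, ZonedDateTime}
import com.google.protobuf.Timestamp

// hand-written glue for a single field-type pair
def toTimestamp(time: ZonedDateTime): Timestamp =
  Timestamp.newBuilder().setSeconds(time.toEpochSecond).setNanos(time.getNano).build()

def toZonedDateTime(ts: Timestamp): ZonedDateTime =
  ZonedDateTime.ofInstant(Instant.ofEpochSecond(ts.getSeconds, ts.getNanos), ZoneId.systemDefault())

Every such pair, multiplied by every field in every message, is what we would like to stop writing by hand.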
Drawing on the Reader/Writer design concept of Github.com/playframewo… and the macro-based type conversion approach of github.com/scalalandio…, we eventually settled on a design that combines Scala macros with an implicit-parameter DSL, implemented at github.com/changvvb/sc…
Effect of the solution
Let's define a case class User and a Protobuf message UserPB here to compare the effect before and after using this scheme.
case class User(
  id: Long,
  name: String,
  phoneNumber: Option[String],
  hobbies: Seq[String]
)

message UserPB {
  int64 id = 1;
  string name = 2;
  google.protobuf.StringValue phone_number = 3;
  repeated string hobbies = 4;
}
If we hand-wrote the Scala case class to protobuf-Java conversion ourselves, it would look like this:
val user = User(1,"Jack",Some("1234567890"),Seq("ping pong", // create a builderbuilder.setid (user.id).setname (user.name) // setId, Name Field if (user. PhoneNumber. IsDefined) {/ / here can also be abbreviated as the user directly. The phoneNumber. The map (StringValue. Of.) foreach (builder. SetPhoneNumber) Builder.setphonenumber (stringValue.of (user.phonenumber.get))} Builder.setallhobbies (user.hobbies. AsJava) // Set hobbies Field val userPB = Builder.build // Build the userPB objectCopy the code
Converting a Protobuf-Java object back to a Scala case class object requires similar code, except that the results of the corresponding get methods are passed into the User constructor.
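For reference, a minimal sketch of that hand-written reverse direction, using the standard Protobuf-Java getters, might look like this:

import scala.jdk.CollectionConverters._   // for asScala

val userFromPb = User(
  id = userPB.getId,
  name = userPB.getName,
  phoneNumber = if (userPB.hasPhoneNumber) Some(userPB.getPhoneNumber.getValue) else None,
  hobbies = userPB.getHobbiesList.asScala.toSeq
)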
Using our solution, the code would look like this:
val user = User(1,"Jack",Some("1234567890"),Seq("ping pong", "reading"))val userPB = Protoable[User, UserPB].toproto (user) // One line of codeCopy the code
As you can see, the code is an order of magnitude simpler, and it also performs type-safety checks for us. It genuinely achieves the goals of simplicity, safety, and ease of use.
Here we will introduce the design methods and ideas of this tool, as well as the philosophy behind it.
DSL design
The two most fundamental traits in the DSL are Protoable[-S, +P] and Scalable[+S, -P], where S represents a Scala type and P represents a Protobuf-Java type. A Protoable is something that can convert a Scala type to a Protobuf-Java type, and a Scalable does the opposite. Covariance and contravariance are used here.
trait Protoable[-S, +P] {
  def toProto(entity: S): P
}

trait Scalable[+S, -P] {
  def toScala(proto: P): S
}
Next, we write some constructors and some default converters in their companion objects, which gives us the basic building blocks.
object Scalable {
  def apply[S, P](convert: P => S): Scalable[S, P] = x => convert(x)

  implicit val javaIntScalable = Scalable[Int, java.lang.Integer](_.toInt)
  implicit val stringValueScalable = Scalable[String, StringValue](_.getValue)
  implicit val zonedDateTimeScalable = Scalable[ZonedDateTime, Timestamp] { proto =>
    // reverse of the Protoable direction below
    ZonedDateTime.ofInstant(Instant.ofEpochSecond(proto.getSeconds, proto.getNanos), ZoneId.systemDefault())
  }
}

object Protoable {
  def apply[S, P](convert: S => P): Protoable[S, P] = x => convert(x)

  implicit val javaDoubleProtoable = Protoable[Double, java.lang.Double](_.toDouble)
  implicit val stringValueProtoable = Protoable[String, StringValue](StringValue.of)
  implicit val zonedDateTimeProtoable = Protoable[ZonedDateTime, Timestamp] { entity =>
    Timestamp.newBuilder().setSeconds(entity.toEpochSecond).setNanos(entity.getNano).build()
  }
}
Use macros to automatically generate code
Still with the two User and UserPB types defined earlier in this article, how can we use the above DSL to write conversions between them?
Looking directly at the desired result, we could write:
new Protoable[User, UserPB] {
  override def toProto(entity: User): UserPB = {
    val builder = UserPB.newBuilder()
    builder.setId(entity.id)
    builder.setName(entity.name)
    if (entity.phoneNumber.isDefined) {
      builder.setPhoneNumber(implicitly[Protoable[String, StringValue]].toProto(entity.phoneNumber.get))
    }
    builder.addAllHobbies(implicitly[Protoable[Seq[String], java.util.List[String]]].toProto(entity.hobbies))
    builder.build
  }
}

new Scalable[User, UserPB] {
  override def toScala(proto: UserPB): User = {
    new User(
      id = proto.getId,
      name = proto.getName,
      phoneNumber = if (proto.hasPhoneNumber) {
        Some(implicitly[Scalable[String, StringValue]].toScala(proto.getPhoneNumber))
      } else {
        None
      },
      hobbies = implicitly[Scalable[Seq[String], java.util.List[String]]].toScala(proto.getHobbiesList)
    )
  }
}
This is the code we need Scala macros to generate. It uses the Protoable and Scalable traits defined above, together with Scala's implicit parameters, which makes the abstract syntax tree easier to construct. The advantages include:
- Data conversion and processing stay entirely within the framework of our DSL design, so every conversion problem can be expressed with the Protoable and Scalable traits.
- We take advantage of the compiler's implicit parameter lookup: whenever a field requires a conversion between different types, the compiler looks for the corresponding converter in the current context. The implicitly[T] method from the Scala standard library finds an implicit value of type T in scope; here, for example, we need to find an implicit value of type Scalable[String, StringValue].
- Combined with the previous point, when converting two objects we do not need to recursively consider nested objects. When generating code we only need to focus on the relationship between the fields of the current object, including some simple handling of Option types and collection types, and nothing else. Everything else is handed over to the compiler through implicit parameters, which greatly reduces the design cost.
- Types are easy to extend. If we need a system-level converter, we add one to the Protoable and Scalable companion objects; if we need a business-specific converter, we define it in a scope the code can reach, and Scala's implicit lookup rules will find it (see the sketch after this list).
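As an example of that last extension point, here is a minimal sketch that defines converters for a type the library does not need to know about, assuming a UUID value stored as a StringValue on the Protobuf side (the names are illustrative):

import java.util.UUID
import com.google.protobuf.StringValue

// business-specific converters defined in local scope
implicit val uuidScalable: Scalable[UUID, StringValue] =
  Scalable[UUID, StringValue](pb => UUID.fromString(pb.getValue))
implicit val uuidProtoable: Protoable[UUID, StringValue] =
  Protoable[UUID, StringValue](uuid => StringValue.of(uuid.toString))

// conceptually, the generated code resolves them the same way it resolves the defaults
val fieldConverter = implicitly[Scalable[UUID, StringValue]]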
Clearly, we have found a general rule for handling the conversion between fields, and we can now let Scala macros generate the code we need at compile time. The macro constructors can be defined in the respective companion objects.
object Protoable {
  def apply[S <: Product, P]: Protoable[S, P] = macro ProtoScalableMacro.protosImpl[S, P]
}

object Scalable {
  def apply[S <: Product, P]: Scalable[S, P] = macro ProtoScalableMacro.scalasImpl[S, P]
}
As you can see, each method only requires two type parameters. How the macros are implemented is not discussed in detail here; the general idea is to use Scala macros to build, at compile time, the abstract syntax tree (AST) of code like the above for each pair of types.
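Purely as an illustration of the kind of compile-time inspection involved (this is a sketch, not the library's real implementation), a Scala 2 macro bundle can enumerate the fields of a case class like this:

import scala.reflect.macros.whitebox

class ProtoScalableMacroSketch(val c: whitebox.Context) {
  import c.universe._

  // collect the case-class accessors of S; the real macro walks these fields
  // and emits setXxx/getXxx calls plus implicitly[...] lookups for each one
  def caseClassFields[S: WeakTypeTag]: List[MethodSymbol] =
    weakTypeOf[S].decls.collect {
      case m: MethodSymbol if m.isCaseAccessor => m
    }.toList
}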
We use it to do a two-way conversion, which can be done with very simple code.
val user = User(1,"Jack", Some("1234567890"), Seq("ping pong","coding"))val userPb = Protoable[User, userPb]. ToProto (User) // Convert Scala case class objects to protobuf-java Object val user2 = Scalable[User,UserPB].toscala (User) // convert protobuf-java objects toScala case class objects assert(User == user2)Copy the code
As you can see, no matter how many fields there are, a conversion takes only one line of code, which reduces the amount of code by an order of magnitude. It is also type-safe: a mismatched type or a missing field results in a compilation error.
For nested types, we simply define a converter for the inner type as an implicit value in a scope the compiler can find, and it will be used automatically. Suppose the Outer type has a field of type Inner:
implicit val innerScalable = Scalable[Inner, InnerPB]
Scalable[Outer, OuterPB].toScala(outerObj)   // outerObj is an OuterPB instance
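For concreteness, the types in this example could be imagined as follows (hypothetical definitions, not taken from a real project):

case class Inner(value: String)
case class Outer(id: Long, inner: Inner)

// and on the Protobuf side:
// message InnerPB { string value = 1; }
// message OuterPB { int64 id = 1; InnerPB inner = 2; }

With the implicit innerScalable in scope, the generated Scalable[Outer, OuterPB] finds it automatically when it reaches the inner field.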
Further optimization: using builders to customize the conversion logic of individual fields
Suppose we have a business scenario where, for example, the id must always be greater than or equal to 0 when converting UserPB to User. With only the macro-generated converter, this seemingly simple requirement is hard to implement, and even if it were implemented the result would probably be ugly. So we introduced builder constructors, which inject custom rules into the macro-generated code.
val scalable = ScalableBuilder[User, UserPB]
  .setField(_.id, userPb => if (userPb.getId < 0) 0L else userPb.getId)
  .setField(_.name, userPb => userPb.getName)   // same as the default behaviour for this field
  .build
scalable.toScala(...)
setField takes two arguments: a field selector and a lambda expression that takes the source object and returns the value the field should be given. Finally, calling the build method generates a Scalable object.
In addition to ScalableBuilder, there is also a ProtoableBuilder for the conversion in the other direction. Both builders can be used to apply field-level logic or to fill in missing fields, which is useful in many situations.
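A sketch of the other direction, assuming ProtoableBuilder mirrors ScalableBuilder's API with the field selector on the Protobuf side (an assumption; check the library for the exact signature):

val protoable = ProtoableBuilder[User, UserPB]
  .setField(_.getId, (user: User) => math.max(user.id, 0L))   // clamp negative ids when writing out
  .build
val userPB = protoable.toProto(user)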
Scala 3 support
As we know, Scala 3 was officially released only a couple of months ago. Among the features it brings that matter for this tool are:
- More concise implicit parameter definitions
- A new macro design based on inline and quotes
This makes the DSL design simpler, but the macro implementation has to be completely rewritten.
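For example, a default converter that is written today as an implicit val could be expressed with Scala 3's given syntax roughly like this (a sketch, not the released API):

// Scala 3: anonymous given instances replace the implicit vals
given Scalable[Int, java.lang.Integer] = Scalable[Int, java.lang.Integer](_.toInt)
given Protoable[String, StringValue] = Protoable[String, StringValue](StringValue.of)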
There are still some problems with Scala 3:
- Java files generated by protoc cannot be compiled (Github.com/lampepfl/do…), but I have submitted a PR that fixes the problem (Github.com/lampepfl/do…)
- The compile-time reflection API is more limited, which makes some conversions involving type derivation difficult
These are the issues we still need to overcome to fully support Scala 3 in the future.