Preface

During this year’s consulting work on DDD design, especially DDD collaborative design workshops, I found that clients and consultants often clashed in the following ways:

  • Conflict over the “consistency” of experience
    • Clients: expect consistent explanations of the methodology from different consultants;
    • Consultants: find it hard to stay consistent because each consultant’s understanding of the methodology differs.
  • Conflict over the “standardization” of service
    • Clients: expect consultants to standardize the methodology they sell;
    • Consultants: believe the methodology they sell is highly flexible and must be adapted by experience to each situation; standardization cannot, and should not, be done.

Combining my nearly four years of training and teaching at ThoughtWorks with my consulting experience over the past few years, I can understand the clients’ request: they want to implement the methodology at scale. One of the biggest problems in scaling a methodology is that most people of average ability prefer working from “clear instructions” to relying on their “limited experience” and an “ambiguous methodology”.

This article records how I solved this problem.

My benchmarking framework

Benchmarking a methodology like DDD step by step to create “benchmark operations” that provide “clear instructions” is easier said than done. The vast majority of consultants can break the methodology down into stages, but cannot decompose and validate it at a fine granularity.

From my observation, this problem stems mainly from a few causes:

  • **Insufficient in-depth research on the methodology:** selling the methodology exactly as it was learned.
  • **Lack of iteration and polish:** too few opportunities, or too little focus, to iterate on and refine the methodology.

Another important reason is that most technical consultants have probably been away from writing code for so long that they fail to realize that benchmarking a methodology is a lot like how we develop software:

  • First, start from the customer’s needs and clarify the target value and content of the delivery.
  • Then, guided by Tasking thinking and stage acceptance criteria, decompose the target into different stages.
  • Next, work out the detailed implementation of each stage, ensuring its acceptance criteria can be met in the simplest way during execution.
  • Finally, output the first, minimal version of the benchmark content, then improve it through continuous adaptation and polishing, iterating in small steps or making a larger revision (similar to a requirements change).

More importantly, this whole process can be driven by the idea of “Test-Driven Development”!
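To make the analogy concrete, here is a minimal test-first sketch in Python. The `stage_accepted` check and the deliverable names are hypothetical, purely for illustration: the stage acceptance criteria play the role of the test, written before the simplest implementation that satisfies them.

```python
# Hypothetical acceptance check for one benchmarking stage, written test-first:
# the assertions at the bottom were the "test cases" before stage_accepted existed.
def stage_accepted(deliverables, required):
    """A stage passes only when every required deliverable has been produced."""
    return set(required).issubset(deliverables)

# Made-up acceptance criteria for a strategic design stage.
REQUIRED = ["context map", "subdomain list"]

# Red first: these checks define what "done" means for the stage...
assert stage_accepted(["context map", "subdomain list"], REQUIRED)
assert not stage_accepted(["context map"], REQUIRED)
# ...and stage_accepted above is the simplest implementation that passes them.
```

The point is not the code itself but the ordering: decide how a stage will be accepted before deciding how it will be carried out.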

Using TDD to benchmark the DDD design process

So how do I benchmark the DDD design process using TDD ideas? Over the past year, across more than a dozen consulting projects large and small, I did the following:

  • In the first step, through practice on different client projects, I polished and defined clear outputs for each stage and produced the DDD Workshop Access Conditions and Output Legend. This step is equivalent to Tasking a program’s inputs and outputs and defining test cases in test-driven development.
  • In the second step, with inputs and outputs determined, I continued benchmarking the operation steps of each stage through constant polishing across different projects, refined each step into four parts (concept introduction, operation steps, process legend, and key points), and produced the DDD Workshop Operation Manual. This step is equivalent to implementing the program’s “processing” logic test-first and improving it through repeated small iterations.
  • In the third step, after the entire operation manual was complete, I reorganized and abstracted, based on the manual, the simplest and most common concepts it requires, and redefined and optimized, from a macro view, the new phased DDD design (strategic design stage, tactical design stage, technical implementation stage). This resolved the messy, inconsistent, and disjointed knowledge structure I had seen in all the DDD training I had previously attended, minimized the concept-explanation part, and produced the DDD Workshop Concept Explanation courseware. This step is equivalent to layering the architecture and evolving it through a high degree of abstraction during programming and development.
  • In the fourth step, through project development practice and further summarization, combined with various domain-centric layered architecture ideas, I continuously polished the benchmark code into a complete benchmark DDD sample (github.com/howiehu/ddd…
  • In the fifth step, through continuous practice, polishing, and summarization, I produced the DDD Maturity Assessment Standard.
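As a rough sketch of the “domain-centric layered architecture” idea mentioned in the fourth step, the domain model sits at the center and outer layers depend inward through ports. The names below (`Order`, `OrderRepository`, `PayOrderService`) are hypothetical; the actual benchmark sample may be structured quite differently.

```python
from dataclasses import dataclass
from typing import Optional, Protocol

# --- Domain layer: pure business model, no framework dependencies --------
@dataclass
class Order:
    order_id: str
    paid: bool = False

    def pay(self) -> None:
        if self.paid:
            raise ValueError("order already paid")
        self.paid = True

# --- Port: the interface the inner layers depend on ----------------------
class OrderRepository(Protocol):
    def find(self, order_id: str) -> Optional[Order]: ...
    def save(self, order: Order) -> None: ...

# --- Application layer: orchestrates one use case through the port -------
class PayOrderService:
    def __init__(self, orders: OrderRepository) -> None:
        self.orders = orders

    def pay(self, order_id: str) -> Order:
        order = self.orders.find(order_id)
        if order is None:
            raise KeyError(order_id)
        order.pay()
        self.orders.save(order)
        return order

# --- Infrastructure adapter: one replaceable implementation of the port --
class InMemoryOrderRepository:
    def __init__(self) -> None:
        self._store: dict[str, Order] = {}

    def find(self, order_id: str) -> Optional[Order]:
        return self._store.get(order_id)

    def save(self, order: Order) -> None:
        self._store[order.order_id] = order
```

Because the repository is only a port, the infrastructure adapter can be replaced (in-memory, a database, and so on) without touching the domain or application layers; keeping the domain at the center is the whole point of this style of layering.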

In this way, the DDD design process is benchmarked step by step, incrementally, starting from the determined deliverables, much like the idea of “simple design (incremental design)” achieved through TDD when we do software design.

The results

The thinking and design process behind DDD is open, and there is no point keeping it to myself. So I have chosen to share it here, hoping to contribute to the adoption and improvement of DDD in China.

I’m also building a series of DDD benchmarking open materials, communities, and training courses, which I will gradually publish to GitHub’s Domain-driven organization at the following address:

  • Github.com/domain-driv…

The benchmark domain-driven design outputs mentioned in this article are listed below; I will continue to polish and optimize them:

  • DDD Workshop Access Conditions and Output Legend (extraction code: 9JZA)
  • DDD Workshop Operation Manual (extraction code: UU1D)
  • DDD Workshop Concept Explanation (extraction code: B4FT)
  • Sample DDD Benchmark Code Structure
  • DDD Maturity Assessment Standard (still in progress, stay tuned)

I have verified the reusability of these benchmark outputs by leading seven new consultants: with only one explanation or demonstration from me, they were able to independently undertake subsequent DDD design consulting work while staying consistent in concepts and approach.

As for “by experience,” all that remains of it is the ceiling of the operator’s personal experience and IQ.


Welcome to follow my WeChat official account

WeChat search: Guns and Code, or search the official account ID: guns-N-code