This article mainly explains the implementation approach, so only part of the code appears inline. Click here to view the full source code.

First, let’s look at the interaction effect we need to achieve

Second, the features involved

  • Maximum length of the track display, here 3 minutes (freely configurable)
  • Minimum length of a video clip, here 3 seconds (freely configurable)
  • When the video is longer than 3 minutes, the frame strip at the bottom of the track can be scrolled
  • Touch event handling
  • While the frame strip at the bottom of the track is scrolling, the clipped time range is recalculated in real time
  • Timeline animation

Third, break the view down first, so that we can build the custom view

It can be divided into the following parts:

  • The draggable ear on the left
  • The draggable ear on the right
  • The selected red rectangle in the middle
  • The shadow over the unselected sections
  • A timeline

Fourth, complete the drawing by inheriting `CustomPainter` and using `CustomPaint`

The component structure is as follows:

  @override
  Widget build(BuildContext context) {
    return Container(
      height: 48,
      width: double.infinity,
      child: LayoutBuilder(
        builder: (context, constraints) {
          /// Get the size of the canvas here
          _initView(constraints);
          return GestureDetector(
            onHorizontalDragDown: (down) {},
            onHorizontalDragUpdate: (move) {},
            onHorizontalDragEnd: (up) {},
            child: Stack(
              children: [
                Positioned(
                  left: earSize.width,
                  right: earSize.width,
                  child: SingleChildScrollView(
                    scrollDirection: Axis.horizontal,
                    child: _getImageChild(),
                  ),
                ),
                CustomPaint(
                  size: viewSize,
                  painter: VideoTrackPainter(),
                ),
              ],
            ),
          );
        },
      ),
    );
  }

  void _initView(BoxConstraints constraints) {
    if (rightEarOffset == null) {
      /// Canvas size
      viewSize = Size(constraints.maxWidth, constraints.maxHeight);
      /// The size of the middle track
      trackSize = Size(viewSize.width - earSize.width * 2, viewSize.height);
    }
  }
  • The `LayoutBuilder` widget gets the size available to this component, and that size is then passed to `CustomPaint`; so in `CustomPainter#paint()` you receive the same size and can draw directly within it
@override
void paint(Canvas canvas, Size size) {
/// the size is the size that we set
}
  • `GestureDetector` handles the gesture listening; we only need to handle the horizontal drag events
  • A `Stack` layout combines the scrollable frame strip at the bottom with the draggable elements on top
  • As the preview shows, the scrollable frame strip at the bottom should sit between the two draggable ears, so it is inset by the ear width on both sides

Fifth, draw the draggable elements on top one by one

  • Draw the left ear
    • Drawing starts from the canvas's upper-left (0,0) coordinate. Each ear is just a rectangle with rounded outer corners: the left ear rounds its left corners, the right ear rounds its right corners
    • A white rounded rectangle also needs to be drawn in the middle of each ear
  @override
  void paint(Canvas canvas, Size size) {
    _createEar(canvas, leftEarOffset, true);
    _createEar(canvas, rightEarOffset, false);
  }
  /// Create both ears
  void _createEar(Canvas canvas, Offset offset, bool leftCorner) {
    Rect rect = offset & earSize;
    Radius radius = Radius.circular(6);
    RRect rRect = RRect.fromRectAndCorners(
      rect,
      topLeft: leftCorner ? radius : Radius.zero,
      bottomLeft: leftCorner ? radius : Radius.zero,
      topRight: leftCorner ? Radius.zero : radius,
      bottomRight: leftCorner ? Radius.zero : radius,
    );
    earPaint.color = Color(0xFFFF443D);
    canvas.drawRRect(rRect, earPaint);

    /// The white rectangle
    Rect whiteRect = Rect.fromCenter(
        center: Offset(offset.dx + rect.width / 2, offset.dy + rect.height / 2),
        width: earWhiteSize.width,
        height: earWhiteSize.height);
    earPaint.color = Colors.white;
    RRect whiteRRect = RRect.fromRectAndRadius(whiteRect, Radius.circular(4));
    canvas.drawRRect(whiteRRect, earPaint);
  }
  • Draw the right ear
    • The right ear's coordinate formula: `rightEarOffset = Offset(canvas width - ear width, 0)`
  • Draw the middle rectangle
    • Coordinate formula: `Rect.fromLTRB(left ear offset + ear width, 0 + 1, right ear offset, canvas height - 1)`

    The +1 and -1 inset the top and bottom borders toward the middle, because the stroke has width

	  /// Create the middle rectangle
  void _createRect( Canvas canvas, Size size, Offset leftEarOffset, Offset rightEarOffset) {
      double left = leftEarOffset.dx + earSize.width;
      double right = rightEarOffset.dx;
      /// The width of the line
      double top = leftEarOffset.dy + 1;
      double bottom = size.height - 1;
      Rect rect = Rect.fromLTRB(left, top, right, bottom);
      canvas.drawRect(rect, rectPaint);
  }
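As a quick sanity check of the two formulas above, here is a minimal pure-Dart sketch using plain doubles instead of `Offset`/`Rect`; the numbers (a 360×48 canvas and 20-wide ears) are made up for illustration:

```dart
/// Middle-rect bounds [left, top, right, bottom] for a given canvas
/// and ear size, with the left ear at x = 0.
List<double> middleRect(
    double canvasWidth, double canvasHeight, double earWidth) {
  // Right ear is flush with the right edge: canvasWidth - earWidth.
  final rightEarDx = canvasWidth - earWidth;
  // Inset top and bottom by 1px so the stroke stays inside the canvas.
  return [0.0 + earWidth, 0.0 + 1, rightEarDx, canvasHeight - 1];
}

void main() {
  print(middleRect(360, 48, 20)); // [20.0, 1.0, 340.0, 47.0]
}
```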
  • Draw the shadow masks on the left and right sides
    • These are essentially gray rectangles: one from the track's left edge to the left ear, and one from the right ear to the track's right edge, as follows:
  void _createMaskLayer(Canvas canvas, Size size) {
    Rect leftRect =
        Rect.fromLTWH(earSize.width, 0, leftEarOffset.dx, size.height);
    canvas.drawRect(leftRect, maskPaint);
    Rect rightRect = Rect.fromLTWH(rightEarOffset.dx, 0,
        size.width - rightEarOffset.dx - earSize.width, size.height);
    canvas.drawRect(rightRect, maskPaint);
  }
  • Before drawing the timeline, let's clarify how the clipped time range is calculated, i.e. the span selected by the middle rectangle (start time to end time)
    • Start time formula: `double startSecond = left ear offset / (track width / video duration in seconds)`
    • End time formula: `double endSecond = startSecond + (right ear offset - (left ear offset + ear width)) / (track width / video duration in seconds)`
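A minimal pure-Dart sketch of the two formulas, with made-up example values (a 360px track showing a 180s video and 20px ears): `track width / duration` gives pixels per second, so dividing a pixel offset by it yields seconds.

```dart
/// Start of the clipped range in seconds, from the left ear position.
double startSecond(double leftEarDx, double trackWidth, double videoSeconds) =>
    leftEarDx / (trackWidth / videoSeconds);

/// End of the clipped range: start plus the selection width in seconds.
double endSecond(double leftEarDx, double rightEarDx, double earWidth,
        double trackWidth, double videoSeconds) =>
    startSecond(leftEarDx, trackWidth, videoSeconds) +
        (rightEarDx - (leftEarDx + earWidth)) / (trackWidth / videoSeconds);

void main() {
  // 360px track / 180s video → 2px per second.
  // Left ear dragged to x = 32 → clip starts at 16s.
  print(startSecond(32, 360, 180)); // 16.0
  // Right ear at x = 212 → clip ends at 16 + (212 - 52) / 2 = 96s.
  print(endSecond(32, 212, 20, 360, 180)); // 96.0
}
```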
  • Draw a timeline

    The timeline needs an animation that moves it from the start point to the end point and then loops, so all we need to know is where the line starts, where it ends, and how long the move takes

    • The start and end points of the movement are calculated as: `double begin = left ear offset + ear width`; `double end = right ear offset`
    • Duration: as computed above, it is simply `end time - start time`
    • Finally, use an `AnimationController` to start the timeline animation
/// Draw the timeline
void _createTimeLine(Canvas canvas, Size size) {
  Offset start = Offset(timelineOffset.dx, 0);
  Offset end = Offset(timelineOffset.dx, size.height);
  canvas.drawLine(start, end, timelinePaint);
}

/// Timeline animation
startTimelineAnimation() {
  int selectDuration =
      selectEndDur.inMilliseconds - selectStartDur.inMilliseconds;
  _timelineController = AnimationController(
      duration: Duration(milliseconds: selectDuration), vsync: this);
  CurvedAnimation curve =
      CurvedAnimation(parent: _timelineController!, curve: Curves.linear);
  Animation animation =
      Tween(begin: leftEarOffset.dx + earSize.width, end: rightEarOffset!.dx)
          .animate(curve);
  animation.addListener(() {
    setState(() {
      timelineOffset = Offset(animation.value, 0);
    });
  });
  _timelineController?.repeat();
}

Sixth, handle gestures to make the elements move; as mentioned at the start, the `GestureDetector` widget handles this

  • When we receive a touch event, we change the offset of the left or right ear and then call `setState(() {});` to refresh the state, which makes the element move. So here's the question: how do we know whether the left or the right ear is being touched?

The answer: `Rect`'s `contains()` function tells us whether a point lies within a rectangle, so we know which ear the finger is touching

  • On press, determine whether the left or the right ear's offset should change, as follows:
_onDown(Offset offset) {
  double dx = offset.dx;
  if (dx <= 0) dx = 0;
  /// Determine which one is selected
  Rect leftRect = leftEarOffset & earSize;
  if (leftRect.contains(offset)) {
    touchLeft = true;
    return;
  }
  Rect rightRect = rightEarOffset! & earSize;
  if (rightRect.contains(offset)) {
    touchRight = true;
    return;
  }
}
  • Then, while dragging, update the offset and refresh the state to achieve the drag effect
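The drag-update handler itself isn't shown in the article, so here is a hedged sketch of the clamping it needs (the name `clampEarDx`, the `minSelectWidth` parameter, and the plain-double coordinates are illustrative; the real widget works with `Offset`s and calls `setState`): the dragged ear follows the finger but is clamped so the selection never shrinks below the 3-second minimum or leaves the canvas.

```dart
/// Illustrative clamping for the drag-update step.
/// Returns the new x position for the ear being dragged.
double clampEarDx({
  required bool touchLeft,        // which ear _onDown selected
  required double fingerDx,       // current horizontal touch position
  required double otherEarDx,     // x of the ear that is not moving
  required double earWidth,
  required double canvasWidth,
  required double minSelectWidth, // pixels representing the 3s minimum
}) {
  if (touchLeft) {
    // Left ear: between the canvas edge and the right ear minus the minimum.
    final maxDx = otherEarDx - earWidth - minSelectWidth;
    return fingerDx.clamp(0.0, maxDx).toDouble();
  } else {
    // Right ear: between the left ear plus the minimum and the right edge.
    final minDx = otherEarDx + earWidth + minSelectWidth;
    return fingerDx.clamp(minDx, canvasWidth - earWidth).toDouble();
  }
}

void main() {
  // Dragging the left ear past the left edge clamps it to 0.
  print(clampEarDx(
      touchLeft: true, fingerDx: -15, otherEarDx: 300,
      earWidth: 20, canvasWidth: 360, minSelectWidth: 30)); // 0.0
}
```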

There's another problem with touch events: as mentioned at the start, the effect is built with a `Stack`, which means all touch events are consumed by the `CustomPaint` on top. This prevents the bottom frame strip from being swiped left and right to pick a range when the video exceeds the configured 3 minutes.

  • This is where Flutter's event dispatch comes in; it won't be expanded on in detail here, but those interested can look it up
  • Here the condition for event handling is simple:
    • When the touch position is not on the left or right ear, the event needs to keep passing down, so that the `SingleChildScrollView` widget can handle it itself
    • So we need a custom `RenderBox` that overrides the `hitTest()` function to dispatch the event

So how do we customize this `RenderBox`?

  • The simplest and most convenient way is to subclass `CustomPaint` and `RenderCustomPaint`, override `CustomPaint#createRenderObject()` to return the custom `RenderCustomPaint`, and then override its `hitTest` function with our logic; the code is as follows:
class TrackCustomPaint extends CustomPaint {
  const TrackCustomPaint({
    Key? key,
    CustomPainter? painter,
    CustomPainter? foregroundPainter,
    Size size = Size.zero,
    bool isComplex = false,
    bool willChange = false,
    Widget? child,
  }) : super(
          key: key,
          painter: painter,
          foregroundPainter: foregroundPainter,
          size: size,
          isComplex: isComplex,
          willChange: willChange,
          child: child,
        );

  @override
  TrackRenderCustomPaint createRenderObject(BuildContext context) {
    return TrackRenderCustomPaint(
      painter: painter,
      foregroundPainter: foregroundPainter,
      preferredSize: size,
      isComplex: isComplex,
      willChange: willChange,
    );
  }
}

class TrackRenderCustomPaint extends RenderCustomPaint {
  TrackRenderCustomPaint({
    CustomPainter? painter,
    CustomPainter? foregroundPainter,
    Size preferredSize = Size.zero,
    bool isComplex = false,
    bool willChange = false,
    RenderBox? child,
  }) : super(
          painter: painter,
          foregroundPainter: foregroundPainter,
          preferredSize: preferredSize,
          isComplex: isComplex,
          willChange: willChange,
          child: child,
        );

  @override
  bool hitTest(BoxHitTestResult result, {required Offset position}) {
    VideoTrackPainter trackPainter = painter as VideoTrackPainter;
    return trackPainter.interceptTouchEvent(position);
  }
}
  • The specific implementation of `interceptTouchEvent()` just determines whether the current touch position is on the left or right ear
bool interceptTouchEvent(Offset offset) {
  Rect leftRect = leftEarOffset & earSize;
  Rect rightRect = rightEarOffset & earSize;
  return leftRect.contains(offset) || rightRect.contains(offset);
}

Seventh, only one last problem remains: while the bottom frame strip is scrolling, the current clip position needs to be calculated in real time

  • Here is an example with actual data:

    Assume the edited video is 4 minutes long and the track displays at most 3 minutes. The part of the frame strip extending beyond the track then represents 1 minute, so we can calculate the clip's start and end times while the frame strip scrolls as follows

  • The calculation formula is as follows:
/// The strip may not be scrollable at all, so the per-second
/// offset can be 0 and needs to be handled
double scrollerSecond = 0;
double perScrollerSecond = _calcScrollerSecond();
if (perScrollerSecond != 0) {
  scrollerSecond = _scrollController.offset / perScrollerSecond;
}

/// Calculate the scroll offset that represents one second
double _calcScrollerSecond() {
  /// videoDurationSeconds and trackDurationSeconds stand for the
  /// video duration (s) and the track display duration (s)
  int diffDuration = videoDurationSeconds - trackDurationSeconds;
  if (diffDuration == 0) return 0;
  return _scrollController.position.maxScrollExtent / diffDuration;
}
  • The scrolled-in time has now been calculated, so just add it to the start time computed earlier. The final start time formula is: `double startSecond = scrollerSecond + left ear offset / (track width / video duration in seconds)`
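To make the worked example concrete, here is a small pure-Dart sketch under made-up assumptions (a 360px track showing 180s of a 240s video, with a 120px `maxScrollExtent` standing in for `_scrollController.position.maxScrollExtent`):

```dart
/// Seconds contributed by scrolling the frame strip.
double scrollerSeconds(double scrollOffset, double maxScrollExtent,
    int videoSeconds, int trackSeconds) {
  final diff = videoSeconds - trackSeconds; // seconds hidden beyond the track
  if (diff == 0) return 0;
  final perSecond = maxScrollExtent / diff; // pixels of scroll per second
  return scrollOffset / perSecond;
}

void main() {
  // 240s video, 180s track → the 120px overhang represents 60s,
  // i.e. 2px of scroll per second. Scrolling 30px shifts the clip by 15s.
  final scrolled = scrollerSeconds(30, 120, 240, 180);
  print(scrolled); // 15.0

  // Final start time: scrolled seconds + left-ear seconds.
  // 360px track / 180s → 2px per second; left ear at 20px → 10s.
  print(scrolled + 20 / (360 / 180)); // 25.0
}
```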

Eighth, that completes the video editing track UI. This article mainly presents one implementation approach; the full code can be viewed here: GitHub-VideoCropTrack