“This is the third day of my participation in the First Challenge 2022, for more details: First Challenge 2022”.
Preface
In the previous article, I tried to port the Android Palette to Flutter. Once the initial effect was working and a milestone was reached, I excitedly wrote it up.
But while researching how to actually use this plugin, I discovered that a package already exists:
PaletteGenerator
Once again, the truth hits hard:
Don't doubt GitHub's reach. If you can't find a suitable control, try searching again in a different language with different keywords.
But at least there is some comfort in having learned how a theme color is computed and the steps of the algorithm.
The ImageProvider and the PaletteGenerator work together to obtain the theme color.
How to use the palette_generator
Using palette_generator is very, very simple: just call fromImageProvider and pass in an ImageProvider that supplies the image, such as AssetImage, NetworkImage, FileImage, and so on.
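A minimal sketch of that call, using the real palette_generator API (the asset path here is a hypothetical example):

```dart
import 'package:flutter/material.dart';
import 'package:palette_generator/palette_generator.dart';

Future<Color?> themeColorFromAsset() async {
  final PaletteGenerator palette = await PaletteGenerator.fromImageProvider(
    const AssetImage('assets/cover.jpg'), // any ImageProvider works:
    // NetworkImage(...), FileImage(...), MemoryImage(...)
  );
  // dominantColor is the swatch with the largest population; it may be null.
  return palette.dominantColor?.color;
}
```

Besides `dominantColor`, the generator also exposes swatches such as `vibrantColor` and `mutedColor`, mirroring the Android Palette targets.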
Of course, you can also pass in some configuration parameters to adjust how the image is processed and which range of pixels is considered; these are described in detail in the comments:
Parameters such as size and region are very useful and can greatly reduce computation time. After all, a theme color does not need a pixel-perfect image; cropping and compression are perfectly fine.
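To make the size/region point concrete, here is a hedged sketch using those named parameters of fromImageProvider (the URL and the specific numbers are arbitrary examples):

```dart
final palette = await PaletteGenerator.fromImageProvider(
  const NetworkImage('https://example.com/cover.jpg'), // hypothetical URL
  size: const Size(100, 100),                 // downscale before sampling
  region: const Rect.fromLTWH(0, 0, 100, 50), // only analyze the top half
  maximumColorCount: 16,                      // cap the number of swatches
);
```

Note that `region` is expressed in the coordinate space of the (scaled) `size`, so the two are usually tuned together.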
Of course, if you already have the Image object itself, or you have custom requirements that mean you need to do something to the Image first, then the method to use is fromImage:
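A sketch of that path, assuming you already hold a decoded ui.Image (the filter shown is the package's default red/black/white filter, exported as `avoidRedBlackWhitePaletteFilter`):

```dart
import 'dart:ui' as ui;
import 'package:palette_generator/palette_generator.dart';

Future<PaletteGenerator> paletteFromImage(ui.Image image) {
  return PaletteGenerator.fromImage(
    image,
    maximumColorCount: 20, // same knob as Android's Palette
    filters: [avoidRedBlackWhitePaletteFilter],
  );
}
```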
Here you can see several parameters carried over from the Android Palette, such as the maximum number of colors and filters. Overall it behaves much like the Android Palette.
So it's time to move on to the Image part, and there is one thing you need to know first:
What is ImageProvider and what does it do
In fact, the comments on ImageProvider already describe its basic workflow very clearly:
A simple translation:
1. ImageProvider constructs an ImageStream as part of the processing; when image parsing and loading complete, the result is delivered through it, and the consumer listens on the stream to get the data;
2. An obtainKey method is provided that returns a key derived from the configuration. This key serves as the identity tag of the image stream and is used by subsequent steps such as parsing and releasing. A mechanism is also provided to route asynchronous errors back to the caller's Zone so that a try/catch can catch them;
PS: um... for those of you who don't know what this Zone that suddenly appeared is, here's a quick summary:
A Zone can be understood as the environment in which an asynchronous operation currently runs. Put this way it is a bit abstract, but that is roughly my understanding: each asynchronous operation runs in its own environment. That is why, in the normal flow, try/catch cannot catch asynchronous errors: the call site and the asynchronous operation live in different Zones, i.e. different environments, and try/catch only catches errors in its own environment;
OK, I will go deeper into the Zone mechanism and async mechanics later; it is not something a sentence or two can explain, so for now just think of it as an environment. Anyone who has worked on error capture and reporting in Flutter will be familiar with it; those interested can read up on it, and I won't go further here;
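A tiny Dart sketch of the Zone idea, using `runZonedGuarded` from dart:async. The point is that the throw happens later, on the event loop, outside the caller's stack frame, so a plain try/catch around the call site would miss it:

```dart
import 'dart:async';

void main() {
  // runZonedGuarded installs an error handler for every async operation
  // started inside it -- the "environment" described above.
  runZonedGuarded(() {
    Future<void>.delayed(const Duration(milliseconds: 1), () {
      throw StateError('async boom');
    });
  }, (Object error, StackTrace stack) {
    print('caught by the zone: $error');
  });
}
```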
3. If the obtainKey method above returns a key successfully, it means the parsing can proceed. During this process the data source may become invalid due to various operations, such as being interrupted or deleted;
4. The parsing method is load, which decodes the image and returns it through the stream created in step 1. If there is an error, it is thrown through the error path described above;
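From the consumer's side, the four steps above can be sketched like this: `resolve()` builds the ImageStream (step 1) and internally calls obtainKey (step 2); decoding happens in load (step 4), and both results and errors arrive through the listener. This is a sketch of typical usage, not the provider's internals:

```dart
import 'package:flutter/widgets.dart';

void listenForImage(ImageProvider provider, BuildContext context) {
  final ImageStream stream =
      provider.resolve(createLocalImageConfiguration(context));
  stream.addListener(ImageStreamListener(
    (ImageInfo info, bool synchronousCall) {
      // info.image is the decoded ui.Image holding the pixel data.
      debugPrint('decoded ${info.image.width}x${info.image.height}');
    },
    onError: (Object error, StackTrace? stack) {
      // Errors from obtainKey/load are forwarded here.
      debugPrint('image failed: $error');
    },
  ));
}
```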
How does PaletteGenerator drive the ImageProvider and retrieve the decoded data
Coming to where the ImageProvider is called, you can see that the first step is to create and listen on the stream:
Hmm... strictly speaking, the first step creates the stream and a Completer to forward the listener's result, and the second step starts listening;
Then it’s just a matter of waiting for the image decoding to complete:
Remember this part from the last article? This is how you get the ui.Image containing the decoded pixel data;
The operations after that are the step-by-step analysis already covered, so I won't describe them again;
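Once the decoded ui.Image is in hand, the raw pixels it contains can be read out with `toByteData`, which is the form the palette computation consumes. A minimal sketch:

```dart
import 'dart:typed_data';
import 'dart:ui' as ui;

Future<Uint8List?> rgbaPixels(ui.Image image) async {
  final ByteData? data =
      await image.toByteData(format: ui.ImageByteFormat.rawRgba);
  // 4 bytes per pixel (R, G, B, A), laid out row by row.
  return data?.buffer.asUint8List();
}
```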
Conclusion
As is standard practice, it's time to show the result:
Well, the effect is OK, especially for "The Best Son-in-law": the extracted theme color is that of the Hulunbuir grassland cover, which, combined with the novel's title, can't help but make one's mind wander;
But the downside is obvious too: out of laziness I just used FutureBuilder directly... so the initial default background color is black, and the color change feels a bit abrupt, like a sudden flash...
Maybe I should use an Animation plus a ColorTween to make it a gradual transition?
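A hedged sketch of that idea: instead of snapping from the default black to the computed color, animate between the old and new values. `AnimatedContainer` performs the color lerp (effectively a ColorTween) internally, so it is the lightest way to try this. The widget below is a hypothetical wrapper, not code from the project:

```dart
import 'package:flutter/material.dart';

class ThemedBackground extends StatelessWidget {
  const ThemedBackground({super.key, required this.color, required this.child});

  final Color color; // the palette color, once computed
  final Widget child;

  @override
  Widget build(BuildContext context) {
    return AnimatedContainer(
      duration: const Duration(milliseconds: 400),
      color: color, // changing this triggers a smooth color transition
      child: child,
    );
  }
}
```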