Link: medium.com/flutter-com…

Author: Medium.com/@mksl

Published: July 30, 2021 · 9 minute read

Introduction

With the advent of FFI in Dart, and now WebAssembly interop as well, new possibilities have opened up for creating rich, non-GUI applications with Dart on desktop operating systems such as Linux, Windows, and macOS, and even on “large embedded” devices like the Raspberry Pi.

In this article, I’d like to show you how I’m using the new Dart wasm package to build a prototype Dart-based audio synthesizer on top of a rich existing codebase written in AssemblyScript, which can serve as a working example of how to use AssemblyScript code from Dart.

as-audio, from Synth1

For me, this particular journey began when I first saw Peter Salomonsen’s wonderful work on creating browser-based synthesizer music in the great talk he gave in February 2020. While Peter’s AssemblyScript code (compiled to Wasm) was cool, it didn’t really fit the hardware-based Dart audio experiments I was already working on at the time, so although I played with the fun online demo and learned a lot from watching the talk and reading the code, I moved on to other things.

It wasn’t until I stumbled across the Dart wasm package under development that I saw a way to leverage some of Peter’s excellent Wasm “Synth1” codebase.

Although I would have liked to use Synth1 as-is, it was written to run in a web browser environment. That meant some parts were not relevant to what I was doing, and other parts, such as MIDI processing and sequencing, I had already handled in my own Dart code. So I decided to create a new mini project, called as-audio, into which I could bring the parts of the Synth1 AssemblyScript codebase that I need, one by one, and easily build a separate Wasm binary.

Using AssemblyScript Wasm from Dart

Even in its pre-release state, the wasm package has good documentation and examples of using Wasm compiled from C++, but there is understandably no documentation yet on how to use Wasm compiled from AssemblyScript. So, before looking at how to use AS Wasm from Dart, let’s first turn our attention to AssemblyScript itself.

Assembly-what?

AssemblyScript describes itself as “designed for WebAssembly”: it targets WebAssembly’s feature set specifically, giving developers low-level control over their code. In terms of syntax, the language is basically a subset of TypeScript, with an emphasis on the strict typing that makes it suitable for compiling to WebAssembly.

Another advantage of AS from my perspective is that, coming from a background in C, Java, JS, Kotlin, and Dart, I found it easy to get started. It also has a simple toolchain setup for compiling to binary Wasm (as AS advertises on its website, an npm install is all it takes), and, importantly, it is also very flexible in areas such as memory management.

Since the focus of this article is on using Wasm from Dart, Wasm which just happens to be compiled from AS, I won’t go into further detail on how I set up my as-audio project to build the binary .wasm files, but will instead direct interested readers to the as-audio GitHub repo.

Dart meets Wasm

Once I had the initial version of as-audio set up to create a binary .wasm file, I was ready to start trying it out from Dart. But before I started calling the WebAssembly code, I decided I wanted a simpler starting reference point, so I first created an interface for an oscillator:

abstract class Oscillator { 
  double next(); 
}

I then went ahead and created a Dart port of the AS code for what is something like the “Hello World” of audio synthesis: a sine wave oscillator.

import 'dart:math'; // for sin() and pi

class DartSineOscillator implements Oscillator {
  int position = 0;
  final double frequency;
  final double sampleRate;

  DartSineOscillator(this.sampleRate, this.frequency);

  @override
  double next() {
    // 16-bit phase accumulator, matching the AS implementation below.
    final ret = sin(pi * 2 * position / (1 << 16));
    position =
        ((position + (frequency / sampleRate) * 0x10000).toInt()) & 0xffff;
    return ret;
  }
}

This meant I now had a way to generate sound samples and test their playback before needing to deal with any of the complications of using WebAssembly (more on those soon).
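To sanity-check the port, the oscillator can be driven directly. Here is a minimal sketch; the names and sample count are my own illustration, not code from the as-audio repo:

void main() {
  // A 440 Hz sine oscillator running at a 44.1 kHz sample rate.
  final osc = DartSineOscillator(44100, 440);

  // Generate one second's worth of mono samples in the range [-1.0, 1.0].
  final samples = List<double>.generate(44100, (_) => osc.next());
  print(samples.take(5).toList());
}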

A rose by any other name

Finally, at this point I could try using the Wasm module from Dart. As described in the excellent announcement/tutorial article on the new wasm package, using a compiled binary Wasm module is very simple:

import 'dart:io';

import 'package:wasm/wasm.dart';

var wasmFile = Platform.script.resolve(wasmFilePath);
var moduleData = File(wasmFile.path).readAsBytesSync();
WasmModule _wasmModule = WasmModule(moduleData);
WasmInstance _instance = _wasmModule.builder().build();

The package also provides a very handy way to get a listing of all of a module’s imports and exports:

print(module.describe());

For my as-audio example module, this produces:

export global: var float32 SAMPLERATE
export global: const int32 SineOscillator
export function: int32 SineOscillator#get:position(int32)
export function: void SineOscillator#set:position(int32, int32)
export function: float32 SineOscillator#get:frequency(int32)
export function: void SineOscillator#set:frequency(int32, float32)
export function: float32 SineOscillator#next(int32)
export function: int32 SineOscillator#constructor(int32)
...

I’ll come back to SAMPLERATE later, but for now compare the above to the AS source of the SineOscillator:

export class SineOscillator {
  position: u32 = 0;
  frequency: f32 = 0;

  next(): f32 {
    let ret = sin(PI * 2 * (this.position as f32) / (1 << 16 as f32));
    this.position = (((this.position as f32) +
        (this.frequency / SAMPLERATE) * 0x10000 as f32) as u32) & 0xffff;
    return ret as f32;
  }
}

One thing you will notice, which isn’t covered by the “Brotli” C code example in that announcement article, is that if you compare the source code with the export listing, the exported function names are not exactly the same as in the AS code. This is because WebAssembly has no concept of OOP (object-oriented programming), and so no concept of AS classes. Because of this, the AS compiler “mangles” these names, encoding both the class name and the method name into the resulting Wasm function name.

This process should be fairly familiar to anyone who has ever dealt with interfacing C to C++ code and the similar name mangling that C++ compilers use by default, although in the case of AS the mangling scheme is pretty straightforward, as you can see above.

Another difference between interfacing with Wasm compiled from AS versus Wasm compiled from C code is the curious extra int32 parameter added to every exported Wasm function. Again, this may be familiar to developers with an OOP background, as adding an implicit “this” reference to every method call on an object is a “classic” OOP implementation trick. But where do we get a reference to an object from? From the return value of a call to the object’s class constructor, of course. So to instantiate and use a sine oscillator class from Dart:

final cons = helper.getMethod(oscClassname, 'constructor');
// Calling an AssemblyScript constructor returns an i32, which is a
// "reference" to the object it created.
_oscObjectRef = cons(0);
_setFrequency = helper.getMethod(oscClassname, 'set:frequency');
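With the object reference in hand, calling a method is then just a matter of passing it as that extra first argument. A minimal sketch, reusing the helper and _oscObjectRef from above:

// The implicit "this" is passed explicitly as the first argument.
final next = helper.getMethod(oscClassname, 'next');
final double sample = next(_oscObjectRef); // one sample in [-1.0, 1.0]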

The helper simply encodes our understanding of the AS name mangling scheme:

dynamic getMethod(String className, String methodName) { 
  return _instance.lookupFunction('$className#$methodName');
}

I purposely used setting the frequency as the example above, because it also demonstrates that there is no magic to AS object properties: the AS compiler simply defines implicit getter and setter methods using the colon-separated name mangling convention, which should be familiar to Dart users, since Dart does much the same thing. More details on how AS handles the names of exported functions, including how to customise them, can be found in its very good documentation.
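For example, setting and then reading back the oscillator frequency from Dart looks like this (a sketch following the same pattern as above):

// Property access goes through the compiler-generated get:/set: exports.
final setFrequency = helper.getMethod(oscClassname, 'set:frequency');
final getFrequency = helper.getMethod(oscClassname, 'get:frequency');

setFrequency(_oscObjectRef, 440.0); // implicit "this" first, then the value
print(getFrequency(_oscObjectRef)); // 440.0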

Native FFI detour

Of course, generated audio samples are of little use if there is no way to play them, so I needed a way to send the samples to my computer’s sound card. While there are several audio plugins available for Flutter applications, unfortunately they all focus on the admittedly far more common use cases of playing audio files (usually compressed) or network streams, rather than small buffers of audio samples generated in real time.

Due to the above limitations of the existing Flutter plugins, I had already started building an FFI wrapper for the standard Linux ALSA libasound library (FFI again!). But while I did get initial playback working, I was very pleased to find that in the meantime someone had done the same thing for libao, which not only provides a nicer/simpler API on top of ALSA, but also supports the PulseAudio sound server and several other operating systems (macOS, Windows, etc.), so I decided to use it for playback.

Playing audio samples with the libao package is just as simple, basically consisting of setting some initial parameters and opening a playback device:

const bits = 16;
const channels = 2;
const rate = 44100;
final device = ao.openLive(
  driverId,
  bits: bits,
  channels: channels,
  rate: rate,
  matrix: 'R',
);

Next comes the actual playback of the audio sample buffer generated by the synthesizer code shown earlier:

// rate frames of 16-bit stereo audio = 4 bytes per frame.
final buffer = Uint8List(rate * 4);
for (var i = 0; i < rate; i++) {
  final sample = (osc.next() * volume * 32768.0).toInt();
  // Write the same little-endian 16-bit sample to left and right channels.
  buffer[4 * i] = buffer[4 * i + 2] = sample & 0xff;
  buffer[4 * i + 1] = buffer[4 * i + 3] = (sample >> 8) & 0xff;
}

ao.play(device, buffer);

A complete, simple example of how to call the AS Wasm code from Dart and then pass the output to libao for playback can be found here. I talked more about using Dart FFI with native libraries in a previous article.
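Condensed into one place, the whole flow looks roughly like this. This is a sketch only: helper, ao, and driverId are the pieces introduced above, the 0.5 volume is arbitrary, and the exact wiring in the repo differs:

import 'dart:typed_data';

void main() {
  // Look up the name-mangled exports once.
  final cons = helper.getMethod('SineOscillator', 'constructor');
  final setFrequency = helper.getMethod('SineOscillator', 'set:frequency');
  final next = helper.getMethod('SineOscillator', 'next');

  // Instantiate the AS oscillator and configure it.
  final osc = cons(0); // i32 "reference" to the new object
  setFrequency(osc, 440.0);

  const rate = 44100;
  final device =
      ao.openLive(driverId, bits: 16, channels: 2, rate: rate, matrix: 'R');

  // One second of 16-bit stereo audio, generated sample by sample.
  final buffer = Uint8List(rate * 4);
  for (var i = 0; i < rate; i++) {
    final sample = (next(osc) * 0.5 * 32768.0).toInt();
    buffer[4 * i] = buffer[4 * i + 2] = sample & 0xff;
    buffer[4 * i + 1] = buffer[4 * i + 3] = (sample >> 8) & 0xff;
  }
  ao.play(device, buffer);
}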

Globals for guests, too

After solving the basic problem of calling Wasm functions from Dart, one of my initial snags was that the AS synthesizer code used “guest globals”, and I found that in its initial pre-release state the wasm package did not expose an API for them. However, to my great surprise, within a few days of me asking about this, the Dart team implemented the feature, which made me very happy.

So, what are guest globals and why did I need them?

As described in the Wasmer documentation, they are, as the name implies, global variables that WebAssembly code can expose to its host environment for reading/writing. In the case of the as-audio code, this is needed so that the host environment can set the audio sample rate to be used:

export var SAMPLERATE: f32 = 44100;
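From the Dart side, the wasm package then lets the host look up and update that global. The API shape below is my recollection of the pre-release package and should be treated as an assumption:

// Assumed pre-release API: look up the exported guest global by name...
final sampleRate = _instance.lookupGlobal('SAMPLERATE');
print(sampleRate.value); // 44100.0
// ...and write the sample rate the host actually wants to use.
sampleRate.value = 48000.0;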

No need for speed

While Wasm compiled from AS probably performs reasonably well (although I haven’t actually bothered to do any benchmarking), I’d like to point out that my motivation for doing this wasn’t performance, but rather being able to reuse existing, well-written and well-tested audio processing code in my Dart applications. Also, as Michael Thomsen pointed out in his article announcing the wasm package:

However, because C modules are platform-specific, distributing a shared package with native C modules is complex: it requires either a common build system or distributing multiple binary modules (one for each required platform). Distribution would be much easier if a single binary module format were available across all platforms. So, instead of compiling your library into platform-specific binaries for each target platform, you can compile it into a single Wasm binary module and run it everywhere. This could potentially open the door to easy distribution of packages containing native code on pub.dev.

The road continues

I’m only just starting on my plan to port all of Synth1’s oscillators, filters, effects, and even instruments over to as-audio and then expose them to Dart, but if you’d like to use it or follow my progress, the code is out there and you’re certainly welcome to send a PR.

While I’ve covered the basics here with a simple working example of using AS-generated Wasm from Dart, this isn’t enough for real use in an actual synthesizer application, because all the code runs in the same single Dart event loop and uses a blocking audio output API. This means that if we try to play audio continuously while generating audio samples in real time, we will soon hear serious breakup in the audio output.

To fix that, we need Dart to make effective use of multi-threaded concurrent execution, which I’ll cover in the next article; subscribe if you want to be notified when it’s published.


The original article appears on manichord.com.

Follow the Flutter Community on Twitter for more fascinating articles: www.twitter.com/FlutterComm

