VST3 Audio Plugin Format Is Now MIT
Posted 2 months ago · Active 2 months ago
Source: forums.steinberg.net · Tech story · High profile
Tone: excited/positive · Debate: 60/100
Key topics
- Audio Processing
- Open-Source Software
- VST3 Plugin Format
Steinberg has released the VST3 audio plugin format under the MIT license, generating excitement and discussion among developers and users about the implications for audio processing and open-source software.
Snapshot generated from the HN discussion
Discussion Activity
- Very active discussion; first comment 5 minutes after posting
- Peak period: 149 comments (Day 1)
- Average per period: 32 comments
- Comment distribution: 160 data points (based on 160 loaded comments)
Key moments
- Story posted: Oct 23, 2025 at 1:48 AM EDT (2 months ago)
- First comment: Oct 23, 2025 at 1:53 AM EDT (5 minutes after posting)
- Peak activity: 149 comments in Day 1 (the hottest window of the conversation)
- Latest activity: Oct 31, 2025 at 2:00 PM EDT (2 months ago)
Want the full context?
Jump to the original sources
Read the primary article or dive into the live Hacker News thread when you're ready.
If you're planning to do that, set aside a lot of time...
[1] https://u-he.com/community/clap/
Also, CLAP uses 3 or 4 methods to represent MIDI data (MIDI1, MIDI1 + MPE, MIDI2, CLAP events). This requires writing several converters when implementing a host.
But CLAP is much simpler and doesn't use a COM-like system (VST3 resembles a Windows COM library, with endless interfaces and GUIDs).
Also, VST3 interfaces in the SDK are described as C++ classes with virtual functions (example: [1]), and I wonder how they achieve portability if the layout of such classes (vtables) is not standardized and may differ across compilers.
[1] https://github.com/steinbergmedia/vst3_pluginterfaces/blob/3...
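To make the vtable concern concrete, here is a minimal sketch of the pattern in question (a hypothetical interface loosely modeled on FUnknown, not actual SDK code). Its portability rests on each platform's dominant C++ ABI (Itanium on Linux/macOS, MSVC on Windows) laying out a single-inheritance vtable the same way: one slot per virtual function, in declaration order.

```cpp
#include <cstdint>

struct TUID { uint8_t data[16]; };  // 16-byte GUID, as VST3 uses

struct IExample {
    // The first three slots mirror FUnknown: queryInterface/addRef/release.
    virtual int32_t  queryInterface(const TUID& iid, void** obj) = 0;
    virtual uint32_t addRef() = 0;
    virtual uint32_t release() = 0;
    // Interface-specific methods follow, in declaration order.
    virtual int32_t  process(float* buffer, int32_t numSamples) = 0;
};
```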
If a compiler and linker don't follow those ABIs then it would also be close to useless for compiling or linking against shared libraries. So in the wild, all useful compilers do target the same ABIs.
GCC with MinGW on Windows is the odd duck, but most production software does not support it anyway.
I guess in C++ you are not supposed to link libraries produced by different compilers? Maybe you should use C-compatible interfaces in this case?
The C standard similarly does not specify an ABI.
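For contrast, a sketch of the C-compatible alternative suggested above - and the approach CLAP actually takes - is an explicit struct of function pointers, whose layout is pinned down by the platform C ABI rather than by any C++ vtable convention. All names here are invented:

```cpp
#include <cstdint>

extern "C" {

// Each "method" is a plain function pointer taking the object first; the
// struct layout is fixed by the C ABI, so any compiler pair interoperates.
typedef struct example_plugin {
    void* impl;  // opaque pointer to the implementation
    void    (*destroy)(struct example_plugin* self);
    int32_t (*process)(struct example_plugin* self,
                       float* buffer, int32_t num_samples);
} example_plugin_t;

// The only symbol the host resolves from the shared library.
example_plugin_t* example_create_plugin(void);

}  // extern "C"
```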
Contrast to VST3, which doesn't support MIDI at all, unless you count creating thousands of dummy parameters hardcoded to MIDI controller numbers as "support."
This makes sense if you want to map a controller to a plugin parameter in a DAW. However, if you want to write a "MIDI effect", which transforms incoming MIDI data for controllers, it would be difficult.
Also, it is interesting that VST3 has an event for note expression and a separate event for polyphonic pressure, although polyphonic pressure can be considered a note expression.
And nearly everyone except Steinberg considers this to be a mistake. MIDI messages (CCs, pitch bend, and so on) are _not_ parameters.
Or you could have an automation curve that produces a sine wave, and have a MIDI knob to modulate its amplitude, and send the result to an LPF "cutoff frequency" input, so that you can control amount of modulation with a knob.
So VST3 (and CLAP) treat each parameter as an additional plugin input, which can be routed arbitrarily.
If a plugin instead had a single MIDI input and extracted controller changes from the MIDI stream, the scenario above would be more difficult to implement (you would need to convert the output of the multiplication plugin into a sequence of MIDI controller changes and combine it with other MIDI events).
Am I missing something here? Never developed a single plugin.
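For what it's worth, the routing described above could look roughly like this on the host side - a hypothetical sketch (the names and the emitParamChange callback are invented) that renders the sine automation, scales it by the knob, and emits the result as timestamped cutoff-parameter events:

```cpp
#include <cmath>
#include <cstdint>

constexpr double kTwoPi = 6.283185307179586;

struct ParamEvent { uint32_t sampleOffset; double value; };

// Host-side modulation: sine automation curve * knob depth -> cutoff input.
void renderCutoffModulation(double lfoHz, double knob01, double baseCutoff01,
                            double sampleRate, uint64_t blockStart,
                            uint32_t blockSize, uint32_t pointsPerBlock,
                            void (*emitParamChange)(ParamEvent)) {
    for (uint32_t i = 0; i < pointsPerBlock; ++i) {
        const uint32_t offset = i * blockSize / pointsPerBlock;
        const double t = double(blockStart + offset) / sampleRate;
        const double lfo = std::sin(kTwoPi * lfoHz * t);  // automation curve
        double v = baseCutoff01 + knob01 * 0.5 * lfo;     // knob scales depth
        v = v < 0.0 ? 0.0 : (v > 1.0 ? 1.0 : v);          // clamp to [0, 1]
        emitParamChange({offset, v});  // plugin interpolates between points
    }
}
```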
The first is that VST3 does not exist in a vacuum. Plugin developers need to ship AU (and previously VST2 concurrently with VST3) plugins. MIDI still is the basic abstraction used by higher level wrappers for sending events to plugins, so in practice, all VST3 events get converted to MIDI anyway (for the majority of plugins).
The second thing is that parameter values are special. You do not actually want the modulation features you are talking about to touch the parameter value, you want them to be used as modulation sources on top of the parameter value. Most synths for example treat MIDI CC and pitch bend as modulation sources to the voice pitch or other parameters (programmable by the user), and not direct controls of parameters. Keep in mind that parameters are typically serialized and saved for preset values.
The third thing is that parameters are not, in practice, allowed to be dynamic. So if you want to use a MIDI controller as a modulation source, you need a dedicated parameter for it; you cannot dynamically add and remove it based on user input.
As an example:
> Or you could have an automation curve that produces a sine wave, and have a MIDI knob to modulate its amplitude, and send the result to an LPF "cutoff frequency" input,
This is not possible in VST3, in any host I'm aware of. You cannot output parameters from one plugin to route to another until 3.8, and I highly doubt anyone will support this.
VST3 is less flexible and more surprising in how it handles MIDI, which is why, historically, VST3 plugins have had very poor MIDI support.
I don't care much how it is done in other DAWs; I just tried to design a routing system for a DAW on paper. I needed a concept that would be simple yet powerful. So I defined 3 types of data - audio data, MIDI data, and automation (parameter) data - and every plugin may have inputs/outputs of these types. If a plugin has 20 parameters, we count them as 20 inputs, and we can connect inputs and outputs however we want (as long as the types match). Of course, we also have 3 types of clips - audio, MIDI, and automation-curve clips. And obviously we can process this data with plugins: we can take a MIDI clip, connect it to a plugin that generates envelopes for incoming MIDI notes, and connect its output to the cutoff frequency of a filter. Why not?
Technically it is possible to process parameter data; we just have to deal with converting data between different formats - some plugins might have "control voltage" inputs/outputs, while others allow changing parameters at sample- or block-precise points. And here VST3, which has a defined model for parameter interpolation, is easier to deal with than plugin formats that don't define an exact interpolation formula.
By the way, I've now noticed a serious weakness in my model - it doesn't support per-note parameters/controllers; all parameters are global in my concept. Guess I need to think more.
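A minimal sketch of the typed-port model described above (all names hypothetical): three port kinds, connections legal only between matching kinds, with a plugin's parameters exposed as extra Param inputs:

```cpp
#include <vector>

enum class PortKind { Audio, Midi, Param };

struct Port       { PortKind kind; int nodeId; int portIndex; };
struct Connection { Port from; Port to; };

// A plugin with N parameters simply exposes N extra Param-kind inputs.
bool tryConnect(std::vector<Connection>& graph, Port from, Port to) {
    if (from.kind != to.kind) return false;  // only matching kinds connect
    graph.push_back({from, to});
    return true;
}
```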
Your point about modulating parameters is valid; however, I am not sure whether it is better to implement modulation in the host and have full control over it (what do we do if a user moves a knob while the automation is playing?) or to have every plugin developer implement it themselves (as in CLAP, which supports parameter modulation).
> This is not possible in VST3,
I think it is possible - the plugin gets a sequence of parameter changes from the host and doesn't care where they come from. As I remember, plugins may also have output parameters, so it is possible to process parameter data using plugins.
> And here VST3, which has a defined model for parameter interpolation, is easier to deal with than plugin formats that do not define exact interpolation formula.
So I'll just reiterate that this is not true for either plugin or host developers and that's not a minority opinion. The parameter value queue abstraction is harder to implement on both sides of the API, has worse performance, and doesn't provide much in benefit over sending a sparse list of time-stamped events and delegating smoothing to the plugin.
> As I remember, plugins may also have output parameters, so it is possible to process parameter data using plugins.
The host forwards those output parameters back to that plugin's editor, not to other plugins. You use this as a hack to support metering, although in practice since this is a VST3 quirk, few people do it. Until 3.8.0 which added the IMidiLearn2 interface there was no way to annotate MIDI mappings for output parameters, which caused hosts to swallow MIDI messages even if they should be forwarded to all plugins. I doubt that the new interface will be implemented consistently by hosts, and now there's a problem where old plugins may do the wrong thing in new versions of hosts that expect plugins to be updated (this is catastrophic behavior for audio plugins - you never want a version update to change how they behave, because it breaks old sessions). There's also no good way to consistently send what are effectively automation clips out of plugins, since the plugin does not have a view into the sequencer.
And most importantly - plugins aren't aware of other plugins. If one plugin outputs a parameter change, it is meaningless to another plugin. Maybe if both plugins implement IMidiMapping2 the host can translate the output parameter change into a MIDI event and then into another parameter change. Sounds a lot stupider than just sending discrete MIDI events.
Essentially, the design of parameters in VST3 is fragile and bad.
VST3 only recently gained the `moduleinfo.json` functionality and support is still materialising. Besides, hosts generally do a much better job about only scanning new plugins or ones that have changed, and hosts like Bitwig even do the scanning in the background. The manifest approach is cool, but in the end, plugin DLLs just shouldn't be doing any heavy lifting until they actually need to create an instance anyway.
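The incremental scanning mentioned here usually boils down to a cache keyed on the plugin binary's metadata - an illustrative sketch, not any real host's code:

```cpp
#include <cstdint>
#include <filesystem>
#include <map>
#include <string>

struct ScanEntry {
    std::filesystem::file_time_type mtime;
    std::uintmax_t size;
};

// Rescan a plugin binary only if it is new or has changed on disk.
bool needsRescan(std::map<std::string, ScanEntry>& cache,
                 const std::filesystem::path& plugin) {
    const ScanEntry now{std::filesystem::last_write_time(plugin),
                        std::filesystem::file_size(plugin)};
    const auto it = cache.find(plugin.string());
    if (it != cache.end() && it->second.mtime == now.mtime &&
        it->second.size == now.size)
        return false;              // unchanged: reuse cached metadata
    cache[plugin.string()] = now;  // new or changed: load the DLL and scan
    return true;
}
```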
> Also, CLAP uses 3 or 4 methods to represent MIDI data (MIDI1, MIDI1 + MPE, MIDI2, CLAP events). This requires to write several converters when implementing a host.
I've not done the host-side work, but the plugin-side work isn't too difficult. It's the same data, just represented differently. Disclaimer: I don't support MIDI2 yet, but I support the other 3.
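The plugin-side work is essentially one switch over the event header. A sketch using the standard CLAP headers (the toInternal*/handle* helpers are hypothetical placeholders for whatever internal representation the plugin uses):

```cpp
#include <clap/clap.h>

// Hypothetical helpers feeding one internal event representation.
void toInternalNoteOn(const clap_event_note_t*);
void handleMidiBytes(const clap_event_midi_t*);  // MIDI 1.0 bytes (incl. MPE)
void handleUmp(const clap_event_midi2_t*);       // MIDI 2.0 UMP packets

void dispatchEvents(const clap_input_events_t* in) {
    const uint32_t n = in->size(in);
    for (uint32_t i = 0; i < n; ++i) {
        const clap_event_header_t* hdr = in->get(in, i);
        if (hdr->space_id != CLAP_CORE_EVENT_SPACE_ID) continue;
        switch (hdr->type) {
        case CLAP_EVENT_NOTE_ON:  // native CLAP note event
            toInternalNoteOn(reinterpret_cast<const clap_event_note_t*>(hdr));
            break;
        case CLAP_EVENT_MIDI:     // raw MIDI 1.0, also how MPE arrives
            handleMidiBytes(reinterpret_cast<const clap_event_midi_t*>(hdr));
            break;
        case CLAP_EVENT_MIDI2:    // MIDI 2.0 UMP
            handleUmp(reinterpret_cast<const clap_event_midi2_t*>(hdr));
            break;
        default: break;
        }
    }
}
```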
On the other side, VST3 has some very strange design decisions that have led me to a lot of frustration.
Having separate parameter queues for sample-accurate automation requires plugins to treat their parameters in a very specific way (basically, you need audio-rate buffers for your parameter values that are as long as the maximum host block) in order to be written efficiently. Otherwise plugins basically have to "flatten" those queues into a single queue and handle them like MIDI events, or alternately just not handle intra-block parameter values at all. JUCE still doesn't handle these events at all, which leads to situations where a VST2 build of a JUCE plugin will actually handle automation better than the VST3 build (assuming the host is splitting blocks for better automation resolution, which all modern hosts do).
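The "flattening" approach reads each per-parameter queue and merges the points into one time-sorted event list. A sketch against the real IParameterChanges/IParamValueQueue interfaces (the FlatEvent type is invented):

```cpp
#include "pluginterfaces/base/funknown.h"
#include "pluginterfaces/vst/ivstparameterchanges.h"
#include <algorithm>
#include <vector>

using namespace Steinberg;
using namespace Steinberg::Vst;

struct FlatEvent { int32 sampleOffset; ParamID id; ParamValue value; };

// Collapse data.inputParameterChanges into one time-sorted event list.
void flattenParamQueues(IParameterChanges* changes, std::vector<FlatEvent>& out) {
    out.clear();
    if (!changes) return;
    for (int32 q = 0; q < changes->getParameterCount(); ++q) {
        IParamValueQueue* queue = changes->getParameterData(q);
        if (!queue) continue;
        const ParamID id = queue->getParameterId();
        for (int32 p = 0; p < queue->getPointCount(); ++p) {
            int32 offset = 0;
            ParamValue value = 0.0;
            if (queue->getPoint(p, offset, value) == kResultOk)
                out.push_back({offset, id, value});
        }
    }
    // Merge the per-parameter queues into one timeline.
    std::sort(out.begin(), out.end(), [](const FlatEvent& a, const FlatEvent& b) {
        return a.sampleOffset < b.sampleOffset;
    });
}
```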
duped's comment about needing to create "dummy" parameters which get mapped to MIDI CCs is spot-on as well. JUCE does this. 2048 additional parameters (128 controllers * 16 channels) just to receive CCs. At least JUCE handles those parameters sample-accurately!
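The dummy-parameter scheme amounts to reserving a contiguous block of parameter IDs and indexing it by (channel, controller), so the host's MIDI-mapping lookup can point each CC at "its" parameter. A sketch (the base ID is arbitrary):

```cpp
#include <cstdint>

constexpr uint32_t kMidiCCParamBase = 0x40000000;  // arbitrary illustrative base

// One dummy parameter per (channel, controller): 16 * 128 = 2048 IDs.
constexpr uint32_t midiCCToParamID(uint32_t channel, uint32_t controller) {
    return kMidiCCParamBase + channel * 128 + controller;
}
```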
There's other issues too but I've lost track. At one point I sent a PR to Steinberg fixing a bug where their VST3 validator (!!!) was performing invalid (according to their own documentation) state transitions on plugins under test. It took me weeks to get the VST3 implementation in my plugin framework to a shippable state, and I still find more API and host bugs than I ever hit in VST2. VST3 is an absolute sprawl of API "design" and there are footguns in more places than there should be.
On the contrary, CLAP support took me around 2 days, 3 if we're being pedantic. The CLAP API isn't without its warts, but it's succinct and well-documented. There are a few rough edges (the UI extension in particular should be clearer about when and how a plugin is supposed to actually open a window), but these are surmountable, and anecdotally I have only had to report one (maybe two) host bugs so far.
Again, disclaimer: I was involved in the early CLAP design efforts (largely the parameter extension) and am therefore biased, but if CLAP sucked I wouldn't shy away from saying it.
Oh, I forgot about parameters. In VST3, parameter changes use linear interpolation, so the DAW can predict how the plugin will interpret a parameter value between changes and use this to create the best piecewise-linear approximation of an automation curve (not merely sampling the curve every N samples uniformly, which is not perfect).
CLAP has no defined interpolation method, so every plugin will interpolate the values in its own, unique and unpredictable way (and if you don't interpolate, there might be clicks). That makes it more difficult for a host to create an approximation of an automation curve, so with CLAP, "sample-precise" might not actually be sample-precise.
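Under that model, the plugin-side reconstruction between two queue points is just a linear interpolation, which is what lets the host predict the exact curve. A sketch:

```cpp
#include <cstdint>

// Value at `sample`, between queue points (x0, v0) and (x1, v1).
inline double lerpParam(int32_t x0, double v0, int32_t x1, double v1,
                        int32_t sample) {
    if (x1 <= x0) return v1;  // degenerate span: snap to the later point
    const double t = double(sample - x0) / double(x1 - x0);
    return v0 + t * (v1 - v0);
}
```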
I didn't find anything about interpolation in the spec, but it mentions interpolation for note expressions [1]:
> A plugin may make a choice to smooth note expression streams.
Also, I thought that maybe CLAP should have used the same event for parameters and note expressions? Aren't they very similar?
> duped's comment about needing to create "dummy" parameters which get mapped to MIDI CCs is spot-on as well. JUCE does this. 2048 additional parameters (128 controllers * 16 channels) just to receive CCs. At least JUCE handles those parameters sample-accurately!
What is the purpose of this? Why does a plugin (unless it is a MIDI effect) need values for all controllers? Also, MIDI2 has more than 128 controllers anyway, so this is a poor solution.
[1] https://github.com/free-audio/clap/blob/main/include/clap/ev...
Can you link to any code anywhere that actually correctly uses the VST3 linear interpolation code (other than the "again_sampleaccurate" sample in the VST3 SDK)? AU also supports "ramped" sample-accurate parameter events, but I am not aware of any hosts or plugins that use this functionality.
> CLAP has no defined interpolation method, and every plugin would interpolate the values in its own, unique and unpredictable way (and if you don't interpolate, there might be clicks). It is more difficult for a host to create an approximation for an automation curve. So with CLAP "sample-precise" might be not actually sample-precise.
Every plugin does already interpolate values on its own. It's how plugin authors address zipper noise. VST3 would require plugin authors to sometimes use their own smoothing and sometimes use the lerped values. Again, I'm not aware of any plugins that actually implement the linear interpolated method. I think Melda? It certainly requires both building directly on the VST3 SDK and also using the sample-accurate helpers (which only showed up in 2021 with 3.7.3).
Anyway, I maintain that this is a bad design. Plugins are already smoothing their parameters (usually with 1 pole smoothing filters) and switching to this whole interpolated sample accurate VST3 system requires a pretty serious restructuring.
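For reference, the smoothing most plugins already do is roughly a one-pole exponential glide toward the latest target value - a standard sketch, not tied to any particular framework:

```cpp
#include <cmath>

class OnePoleSmoother {
public:
    void setTime(double seconds, double sampleRate) {
        // Classic exponential-approach coefficient.
        coeff_ = std::exp(-1.0 / (seconds * sampleRate));
    }
    void setTarget(double v) { target_ = v; }   // called on a parameter event
    void snapTo(double v)    { target_ = current_ = v; }
    double next() {                             // called once per sample
        current_ = target_ + coeff_ * (current_ - target_);
        return current_;
    }
private:
    double coeff_ = 0.0, target_ = 0.0, current_ = 0.0;
};
```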
Personally, I would have loved having a parameter event flag in CLAP indicating whether a plugin should smooth a parameter change or snap immediately to it (for better automation sync). Got overruled, oh well.
> What is the purpose of this? Why does plugin (unless it is a MIDI effect) need values for all controllers? Also, MIDI2 has more than 128 controllers anyway so this is a poor solution.
Steinberg has been saying exactly this since 2004 when VST3 was first released. Time and time again, plugin developers say that they do need them. For what? Couldn't tell you, honestly. In my case, I would have to migrate a synth plugin from MPE to also be able to use the VST3 note expressions system, and I absolutely cannot be bothered - note expressions look like a nightmare.
And this is the chief problem with VST3. The benefits are either dubious or poorly communicated, and the work required to implement these interfaces is absurd. Again – 3 days to implement CLAP vs 3 weeks to implement VST3 and I'm still finding VST3 host bugs routinely.
It's worth mentioning that it's 2 x 16 x 16,384 in MIDI 2, plus 128 x 16 for MIDI 1, because you gotta support both.
But to quote Steinberg devs, "plugins shouldn't handle MIDI CC at all"
COM is just 3 predefined calls in the virtual table. CLAP gives you a bunch of pointers to functions, which is similar.
COM can be as simple as that on the implementation side, at least if your platform's vtable ABI matches COM's perfectly, but it also allows far more complicated implementations, where every implemented interface queried will allocate a new distinct object, etc.
I.e., even if you know for sure that the object is implemented in C++, and your platform's vtable ABI matches COM's perfectly, and you know exactly which interfaces the object implements, you cannot legally use dynamic_cast, as there is no requirement that one class inherits from both interfaces. The conceptual "COM object" could instead be implemented as one class per interface, each likely containing a pointer to some shared data class.
This is also why you need to do the ref counting with respect to each distinct interface: while it is legal on the implementation side to just share one ref count for everything, that is in no way required.
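A hypothetical sketch of that shape (not real COM or VST3 code): queryInterface hands back a distinct facet object with its own ref count, sharing state with the main object, which is exactly why dynamic_cast between the two would fail:

```cpp
#include <atomic>
#include <cstring>
#include <memory>

struct IID { char name[8]; };
inline bool sameIID(const IID& a, const IID& b) {
    return std::memcmp(&a, &b, sizeof(IID)) == 0;
}
constexpr IID IID_Meter{"Meter"};

struct IUnknownLike {
    virtual long queryInterface(const IID& iid, void** out) = 0;
    virtual unsigned long addRef() = 0;
    virtual unsigned long release() = 0;
protected:
    virtual ~IUnknownLike() = default;
};

struct State { std::atomic<float> peak{0.f}; };  // state shared by all facets

struct MeterFacet : IUnknownLike {
    explicit MeterFacet(std::shared_ptr<State> s) : state(std::move(s)) {}
    long queryInterface(const IID&, void**) override { return -1; }  // sketch
    unsigned long addRef() override { return ++refs; }
    unsigned long release() override {
        const unsigned long r = --refs;
        if (r == 0) delete this;
        return r;
    }
    std::shared_ptr<State> state;          // shared with the main object
    std::atomic<unsigned long> refs{1};    // its own ref count
};

struct MainObject : IUnknownLike {
    long queryInterface(const IID& iid, void** out) override {
        if (sameIID(iid, IID_Meter)) {
            *out = new MeterFacet(state);  // distinct object per interface
            return 0;                      // "S_OK"
        }
        return -1;                         // "E_NOINTERFACE"
    }
    unsigned long addRef() override { return ++refs; }
    unsigned long release() override {
        const unsigned long r = --refs;
        if (r == 0) delete this;
        return r;
    }
    std::shared_ptr<State> state = std::make_shared<State>();
    std::atomic<unsigned long> refs{1};
};
```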
You're right that QueryInterface can return a different object, but that doesn't make it significantly more complicated, assuming you're not managing the ref-counts manually.
I think there's still a lot of bad feeling about the fact that there are many VST2 plugins that are open source but nonetheless illegal (or at least tortious) to build.
(Actually the idea is very old and is how most LADSPA plugins were made, but some time in the mid-aughts everybody forgot about it.)
They became popular on the back of Diva, and Hans Zimmer using Zebra (he's very fulsome in his praise whenever he mentions u-he in interviews).
Making a new plugin standard that is gaining wide adoption is a big deal in my book.
u-he's stuff (Diva, Repro, Zebra) has CLAP builds.
As another comment mentioned, I guess FabFilter plugs support CLAP now?
It is niche but I am watching it closely as I would love to get off of Windows or MacOS and CLAP plugins often have linux support.
https://cleveraudio.org/hosts-and-plug-ins/
It will be a while, if ever, before most plugins get the CLAP (pun intended).
Almost all VST plugins have an AU version (like 80%-90% or so, and 99% of the major ones).
Almost no VST plugins have a CLAP version (like 1%-5%, and that's charitable).
On the other hand, there is not a single DAW that only loads CLAP.
That's the same problem Steinberg faced with VST3 years ago: every host/DAW/plugin supported VST2 (including Cubase), so there was no reason for devs to switch to VST3.
Steinberg forced the issue by killing the VST2 licenses; any new plugin or host only had access to the VST3 license. Even then devs resisted, and only recently did Steinberg announce that future Cubase/Nuendo versions won't support VST2 anymore (plugin devs may hate Steinberg, but they won't simply leave Cubase/Nuendo users without support - those users are not to blame for Steinberg's stupidity).
CLAP can't force the issue the same way Steinberg did with VST3, there is no CLAP-only DAW either.
The bigger problem is hosts. While Apple and Avid will probably never support CLAP, everyone but Ableton does. They move slower than the rest of the industry (taking a decade or so to implement VST3), which is odd because CLAP is significantly easier to use from both the host and plugin side.
That said, you can wrap a CLAP plugin as a VST3 or AU today. It's probably the lowest-friction way to do it, to be honest.
previous examples:
* Yamaha saved Korg by buying it when it was in financial trouble and giving it a cash injection, only to then sell it back to its previous owners once they had enough cash[1].
* Yamaha in the 80's had acquired Sequential (for those not familiar: Sequential Circuits is one of the most admired synthesizer makers). Many years later, Sequential's founder Dave Smith established a new company under a different name and in 2015 Yamaha decided to return the rights to use the Sequential brand to Smith, as a gesture of goodwill, on Sequential's 40th anniversary (this was also thanks to Roland's founder Ikutaro Kakehashi who convinced Yamaha that it would be the right thing to do) [1][2][3]
[1] https://www.soundonsound.com/music-business/history-korg-par...
[2] https://www.gearnews.com/american-giants-the-history-of-sequ...
[3] https://ra.co/news/42428
yamaha: sure, here you go
customer: great, thanks! lol, I also need a motorcycle. Do you know where I can buy a good one?
yamaha: you're not gonna believe this...
(fun fact: the motorcycle Triumph and the undergarment Triumph are two entirely different companies that just happen to share the same name)
I see.
In that case, you'll appreciate the fact that the Three Musketeers chocolate bar bears no relationship to Alexandre Dumas, the author of the famed book series featuring D'Artagnan and the three musketeers.
You might also be interested to learn that Zenit launch vehicles are not made by the organization that produces Zenit optics and cameras.
Most crucially, Lucky grocery store chain in California turns out to be completely different from the Korean Lucky chemical products and electronics conglomerate (known as "Lucky GoldStar" after merging its chem and electronics wings, and, currently, "LG").
The more you know!
I think you meant Insult Comic Dog.
Nokia did manufacture rubber boots though, before they spun off the footwear division in 1990 and went all in on electronics.
This changed in 1988 with the formation of an LLC; in 1995 they went public, and in 2003 the shares still held by the parent company were sold off to Bridgestone.
They may have “different legal entities” but it’s the same.
https://en.wikipedia.org/wiki/Yamaha_Corporation
https://en.wikipedia.org/wiki/Yamaha_Motor_Company
It even says in your link "The former motorcycle division was established in 1955 as Yamaha Motor Co., Ltd., which started as an affiliated company but has been spun-off as its own independent company. "
I remember being confused when looking at high end saxophones that one was made by an old French company (that made sense, France makes many fine luxury goods including instruments) and the other was (in my mind) made by a motorcycle company. How could a motorcycle company possibly have compiled the expertise to make high end musical instruments when most musical instrument companies were chasing the low end of the market at the time?
But Yamaha music (1887) was started only 2 years after Selmer (1885). They got their start making reed organs. Reed organs (1) are technical, (2) make sound with reeds, and (3) are luxury items. So their expertise in sax (a reed instrument) and synthesizers (technical keyboard instruments) makes a ton of sense.
Also, you should note that Yamaha Corporation, the musical instrument maker, and Yamaha Motor are now 2 distinct, independent companies, even if they were originally part of the same group.
They actually made at least one model, of which only 3 prototypes were built before the project was cancelled due to the economic situation at the time: the Yamaha OX99-11 supercar used a detuned version of their Formula One V12 engine from when they switched to V10. I guess they had built too many of them?
https://usa.yamaha.com/products/proaudio/network_switches/in...
They have the technical capability to design one, but on the surface it is far enough outside their core product line that I wonder if it is an OEM rebadge.
Looks like the switches came in 2011, and there's some secret sauce that makes them autoconfigure each other to reduce networking setup.
It might not be standard OEM stuff.
Citing the page:
> Yamaha entered the router business in 1995, and has grown to hold a significant share of Japan’s small to medium enterprise and SOHO network market. Yamaha gigabit L2 switches that could be linked to Yamaha routers/firewalls were introduced in 2011, with features that significantly reduced the network setup, maintenance, and management workload.
https://www.yamaha.com/en/about/history/logo/
Customer: "So, I need some huge IGBTs for an electric train motor, I need a 44-tonne excavator to lift the train, I need a new stereo to listen to while I fix it, and I need an, uhm, 'personal massager' to relax afterwards"
Sales guy: "Here's our catalogue, page 40, page 32, page 108, and page 7. Let me know what colours you want."
It's worth a watch.
On another note, it's very telling that companies that protect their "hey! we do this interesting thing, gonna buy?" character survive much longer than companies that say "we can earn a ton of money if we do this."
The companies in the second lot do a lot of harm to their ecosystems to be able to continue existing.
They give good service because they respect the thing they built and the person who cares for it. They want their products to live and bring joy to the people who guard and care for them.
This is a completely different and much deeper philosophy.
This is not limited to Japanese companies though. A Lamy representative told me that the factory in Germany restored an out of production fountain pen to mint condition, for free.
Want a bit for your early 50s Mercedes? Your local dealer knows a guy in Stuttgart who will send it over.
Want a bit for your early 30s Mercedes? Your local dealer knows a guy in Stuttgart who knows a guy who will *just go and make you one*, albeit it will be priced accordingly.
Or the customer will be Austrian and there is nothing for it but assassinating archdukes and bloody warfare over a 1.50EUR gasket.
Despite combing the Earth for that fifth wheel for the spare, he was never able to find one, but since the spare tire was kept underneath a trimmed cover on the back of the car, it wasn't missed. The instrument dials he faked by carefully photographing, and printing on photographic paper, a set of dial faces.
Those rubber window moldings?
He was able to have a well-connected German friend contact Mercedes in Germany where they responded by breaking out the original molds and producing two complete sets for the car.
The year after the car was complete, it won a local Mercedes-Benz of America concours show by 4/100ths of one point over a Gullwing that had just gone through a $200,000 restoration.
Still, they're a separate legal entity, and their HQ, development, support, etc. are still located in Hamburg, as they have been since the early-to-mid 1980s when they released their MIDI sequencing software for the Atari ST (Steinberg Twenty 4, I believe it was called?). I guess you could do worse than being bought by Yamaha, but I think this decision isn't related to it.
They are really well respected in professional music circles. I don't like their tenor saxes, but man, they made some great altos and sopranos, including the mid-tier ones.
Their speakers, I think, are lovely examples of their engineering quality: great, honest sound, some of the best out there, and they are not super overpriced. Also, they are super repairable. I had some really bad experiences with other brands that were more expensive for a more biased sound, had "black gunk" over the PCBs as some kind of anti-repair mechanism (it overheats the boards too! ew!), and other crappy issues.
Cool to hear there's such a story behind the quality. Makes sense!
1. Great news! VSTs seem to fill an important role in the audio-processing software world, and having them more open must be a good thing.
2. From the things they mention, the SDK seems way larger than I had imagined, but that is normal for (software) things, I guess. "This API also enables the scheduling of tasks on the main thread from any other thread." was not easy to unpack, nor to see the use of, in what was (to me) an audio-generation-centered API.
3. The actual post seems to be somewhat mangled; I see both proper inline links and what looks like naked Markdown links, and also bolded words that have double asterisks around them. Much confusing.
VST plugins almost all have a GUI, thus the VST SDK has to support an entire cross-platform UI framework... This threading functionality is mostly about shipping input events/rendering updates back and forth to the main (UI) thread
Like traveling back in time before UX was a word
JUCE is a popular UI framework (at least it was 10 years ago). But I've seen people somehow put Electron apps into a VST.
And yet here we are discussing the value of using C++ vs other languages for real time audio processing.
The basic threading model for plugins is the "main" and "audio" threads. The APIs specify which methods are allowed to be called concurrently from which thread.
There is also a state machine for the audio-processing bits (for example, you can guarantee that processing won't happen until after the plugin has been "activated", and that it won't go from a deactivated state to processing until a specific method is called - I'm simplifying the VST3 state machine significantly).
The "main" thread is the literal main/UI thread of the application typically, or a sandboxed plugin host running in a separate process. You do your UI on this thread as well as handle most host events.
Plugins often want to do things on background threads, like stream audio from disk or do heavy work like preparing visualization without blocking the main UI thread (which also handles rendering and UI events - think like the JS event loop, it's bad to block it).
The threading model and state machine make it difficult to know where it's safe to spawn and join threads. You can do it in a number of places but you also have to be careful about lifetimes of those threads, most plugins do it as early as possible and then shut them down as late as possible.
The host also has to do a lot of the stuff on background threads and usually has its own thread pool. CLAP introduced an extension to hook into the host's thread pool so plugins don't have to spawn threads and no longer have to really care about the lifetime. VST3 is copying that feature.
When you see annotations on methods in these APIs about "main" vs "any" thread and "active" etc they're notes to developers on where it is safe to call the methods and any synchronization required (on both sides of the API).
If it sounds complicated that's because it is, but most of this is accidental complexity created by VST3.
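A compressed, framework-agnostic sketch of that discipline: an atomic lifecycle state the audio thread checks, plus a task handoff that background threads fill and the main thread drains (a real plugin would use a lock-free queue rather than a mutex anywhere near the audio thread):

```cpp
#include <atomic>
#include <functional>
#include <mutex>
#include <vector>

enum class PluginState { Created, Activated, Deactivated };

class PluginCore {
public:
    // --- main thread only ---------------------------------------------------
    void setActive(bool on) {
        state_.store(on ? PluginState::Activated : PluginState::Deactivated,
                     std::memory_order_release);
    }
    void drainMainThreadTasks() {  // called from the UI event loop
        std::vector<std::function<void()>> tasks;
        { std::lock_guard<std::mutex> lk(mtx_); tasks.swap(pending_); }
        for (auto& t : tasks) t();
    }

    // --- any thread ---------------------------------------------------------
    void postToMainThread(std::function<void()> task) {
        std::lock_guard<std::mutex> lk(mtx_);  // sketch only: a real audio
        pending_.push_back(std::move(task));   // thread must not take locks
    }

    // --- audio thread only --------------------------------------------------
    void process(float* buf, int n) {
        // The host promises not to call process() unless activated, but the
        // state machine makes the contract checkable.
        if (state_.load(std::memory_order_acquire) != PluginState::Activated)
            return;
        (void)buf; (void)n;  // ... DSP on buf[0..n) ...
    }

private:
    std::atomic<PluginState> state_{PluginState::Created};
    std::mutex mtx_;
    std::vector<std::function<void()>> pending_;
};
```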
Concretely, it made distributing OSS VST plugins a pain, especially for Linux distributions, which generally want to build their own packages.
Steinberg is only going to benefit from this, I think.
That was with VST2, which is/was a proprietary format. VST3 has been dual-licensed as GPLv3 + commercial for a long time now.
The real value of GitHub, IMHO, is the issue tracker and the visual diff/display of PR changes.
As a composer and arranger working with different studios, I need multiple DAWs installed for compatibility. Every time I open my DAW or Gig Performer after a few days, it rescans all plugins. With around 800 installed, that happens across AU, VST, and VST3.
I hope Apple and Avid are holding meetings after this decision to help simplify the life of library/plugin makers. As an example, AAX requires a complete mess of tooling to compile and test plugins, and several AU plugins are just wrappers around VST that add another layer.
I really hope the next five years bring real standardization and smoother workflows.
I have always felt that the digital audio processing and software industry is overcommercialized and ripe for something akin to Blender to change the game.
Audio programming is still low level and difficult, but I'm looking forward towards vibe coding some experiments with this.
I like to work privately on my open-source projects, and then push them to a public repo after deleting my git history (for the public repo anyway).
I wonder if that will change as well..
At the same time, Steinberg also open-sourced their ASIO audio hardware interface standard, but under GPLv3. GPLv2 would have made more sense to me here, to align with the Linux kernel's GPLv2-only licensing. So why GPLv3? Other commenters here have mentioned OBS, and OBS is "GPLv2 or later", so sure, that works for them. But not being GPL 2 and missing out on the Linux kernel just surprises me.
I have been using the nice cwASIO (https://github.com/s13n/cwASIO) re-implementation of the ASIO SDK; it's MIT licensed. It's nice just to see something more up to date than the ancient ASIO SDK documentation. I would love to see the Steinberg ASIO SDK updated and improved. If you are listening, Steinberg folks: nobody cares about the history of ASIO on Macs or Silicon Graphics workstations; just dive in and get deep into the weeds of ASIO on Windows, and include lots more sample code, especially covering the ASIO device-enumeration mess on Windows.
Also the license change could be caused by competition from CLAP which is very openly licensed.
17 more comments available on Hacker News