... in deep math coding mode, searching for new sounds to hear.
Hypercube VST (beta)
Hypercube VST is a parameter-reduction plugin for VST2 instruments. It reduces the number of synthesizer/effect parameters to just three (3), which makes finding new sounds much easier than tweaking the 100+ parameters that may be exposed through the normal user interface.
In order to do parameter reduction, a VST module must have roughly more good instrument/effect presets than it has synthesizer parameters. The existing good presets are used to learn a "three-dimensional space of good sounds", which can then be explored using only three parameters: X, Y and Z. (The software supports three different parameter-reduction methods: linear ICA, t-SNE and a variational autoencoder, VAE.)
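A rough sketch of the linear variant of this idea (using PCA via SVD for simplicity; the plugin's own methods are linear ICA, t-SNE and a VAE, but the principle is the same). All names, sizes and data below are illustrative, not the plugin's actual internals:

```python
import numpy as np

# Hypothetical example: 40 good presets of a synth with 100 parameters,
# each parameter normalized to [0, 1] (random stand-ins for real presets).
rng = np.random.default_rng(0)
presets = rng.random((40, 100))

# Fit a linear 3-D model of the preset cloud: the top three directions
# of variation become the "space of good sounds".
mean = presets.mean(axis=0)
_, _, vt = np.linalg.svd(presets - mean, full_matrices=False)
basis = vt[:3]                      # three directions, each of length 100

def to_synth_params(x, y, z):
    """Map the three macro knobs (X, Y, Z) back to all 100 parameters."""
    p = mean + np.array([x, y, z]) @ basis
    return np.clip(p, 0.0, 1.0)     # keep values in the synth's range

params = to_synth_params(0.5, -0.2, 0.1)   # one full parameter vector
```

Exploring (x, y, z) then moves through the full 100-dimensional parameter space, but only along directions where good presets already live.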
The software is a beta version, so there is room for improvement, but you can already use it to find interesting sounds. The AI code requires lots of CPU, so you should have a very fast computer to use the tool. You need to compute a parameter-reduction model for each VST instrument separately.
Hypercube VST currently supports only 64-bit VST2 modules; support for 64-bit VST3 instruments is planned for future releases. The software requires 64-bit Windows 10 (earlier versions have not been tested) and 64-bit Java (for the plugin generator user interface).
YouTube video showing how to use Hypercube VST (old version)
I would love to get feedback about this plugin/software.
Eugenics assumes that some genes are clearly better than others. But in game theory, for example, there are competitive settings where the best solution is a probability distribution over different tactics/genes, so no single gene is clearly better (the hawk-dove game is the classic example). In society, similarly, both doctors and judges are needed, so there are no clearly better professions/individuals. One could then argue that successful businessmen or athletes who earn millions should not get more money than successful doctors, scientists and judges, because all of them are needed.
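The hawk-dove point can be worked out in a few lines. With resource value V and fight cost C > V (the textbook payoffs; the numbers here are illustrative), the stable state is a mixed population with hawk fraction V/C, and at that mixture neither "gene" outperforms the other:

```python
# Hawk-dove game: no pure strategy dominates; the evolutionarily stable
# state is a *mixture* of hawks and doves.
V, C = 2.0, 4.0   # resource value and fight cost, C > V

def payoff_hawk(p):
    """Expected payoff of a hawk when a fraction p of the population are hawks."""
    return p * (V - C) / 2 + (1 - p) * V

def payoff_dove(p):
    """Expected payoff of a dove in the same population."""
    return (1 - p) * V / 2

# Setting payoff_hawk(p) == payoff_dove(p) and solving gives p = V/C:
p_star = V / C
print(p_star)                                                  # 0.5
print(abs(payoff_hawk(p_star) - payoff_dove(p_star)) < 1e-12)  # True
```

Since both strategies earn the same at p*, selection pressure holds the mixture in place rather than driving either "gene" to extinction.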
Additionally, if the environment changes and the population must adapt as fast as possible, information theory shows that the fastest way to adapt to the new situation (in an information-theoretic sense, with a few caveats) is to compute using Bayesian inference. This again means calculating/optimizing over population distributions, and the tails of the less probable genes must be supported somehow.
Also, greedy optimization, which often reduces to simple competition-based optimization (gradient ascent), gets stuck in local maxima (see the picture). This means that good optimization of genes may require more than relatively simple changes, competition and survival of the fittest. To escape from a local maximum, a search through potentially worse solutions is required, which means that the weaker ones should be supported by the stronger ones. (Note that if we add money to the equation, individuals can collect money at a local maximum and then use it to fund an expensive search for a new, better optimum. But currently we cannot change our genes using money, so genes cannot be changed to search for better optima after resources have been collected at the local maximum. (To partly work around this limitation, parents support their children until they grow into adults, so the resources collected at a local maximum are used to try out new solutions.))
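The local-maximum argument can be made concrete on a toy fitness landscape (the function and all numbers below are invented for illustration). Plain hill climbing, pure "move to the fitter neighbour", stalls on a lower peak, while a search that also supports worse starting points crosses the valley to the higher one:

```python
# Toy 1-D fitness landscape: a local peak of height 1 at x = 1 and the
# global peak of height 2 at x = 4, separated by a flat valley.
def fitness(x):
    return max(0.0, 1 - (x - 1) ** 2) + max(0.0, 2 - 2 * (x - 4) ** 2)

def hill_climb(x, step=0.1, iters=100):
    """Greedy optimization: always move to the best neighbouring point."""
    for _ in range(iters):
        x = max((x - step, x, x + step), key=fitness)
    return x

local = hill_climb(0.0)          # stalls on the local peak near x = 1
print(round(fitness(local), 2))  # 1.0

# Also trying "worse" starting points lets the search cross the valley:
starts = [0.0, 3.5, 5.0]
best = max((hill_climb(s) for s in starts), key=fitness)
print(round(fitness(best), 2))   # 2.0, the global peak near x = 4
```

The starts at 3.5 and 5.0 begin with fitness at or near zero, worse than the local peak already found, yet supporting them is what reaches the global optimum; pure competition would have eliminated them immediately.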
Tomas Ukkonen, M.Sc.