Better MIDI Pianos

Editing Virtual Piano Performances By Mike Senior
Published August 2023

A few tweaks to your MIDI data can make software piano parts much more realistic — and much easier to mix.

Recording an acoustic piano isn’t a task to be taken on lightly, even if you happen to have access to a well‑maintained instrument in a suitable venue, so it’s little surprise that many project‑studio users turn to MIDI‑triggered software instruments instead. In my own mix work I encounter such simulations on a regular basis and, to be honest, the results often leave a lot to be desired in terms of realism and musicality. So in this article I’d like to recommend some tactics for improving your prospects when working this way.

MIDI Humanity

The first thing to say is that it’s both extraordinarily difficult and extremely time‑consuming to program anything but the simplest piano part by clicking the notes in with your mouse, so I’d encourage you to record your piano part as a real‑time performance wherever possible. If your own performance or improvisation chops aren’t up to the demands of the part, then I’d definitely consider enlisting a more expert ivory‑tickler to help you generate the bulk of the MIDI data you need. Failing that, however, there’s still a lot you can do with quite rudimentary keyboard skills just by recording in small sections and overdubbing the left‑ and right‑hand parts separately.

Irrespective of how you capture your raw MIDI performance, though, I’ve noticed that a lot of MIDI controller keyboards (especially the cheaper synth‑style unweighted variety) seem to favour higher MIDI velocity values, so one of the most common problems I find with the MIDI piano tracks I hear is that they cause the virtual instrument to be played too hard overall, resulting in a hard, brash tone that refuses to sit nicely in the mix without lashings of remedial upper‑spectrum processing. As such, these days I often request the MIDI data for piano parts in songs I’m asked to mix so that I can turn down the MIDI velocity values en masse. I rarely do this by applying a straightforward velocity offset, however: I prefer to use a global velocity multiplier of between 0.9 and 0.7, thereby reducing the velocity value range as well as the overall values. This is because the velocity handling of some MIDI controller keyboards (and of the people playing them!) can be a bit haphazard, so making the velocity values a bit more consistent usually provides an ancillary benefit. In addition, I’ll almost always turn down any obvious velocity spikes individually to avoid single notes that suddenly leap out of the balance undesirably — intentional musical accents excluded, of course!
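The global velocity scaling described above can be sketched in a few lines of code. This is a minimal illustration, assuming notes are represented as simple (time, note, velocity) tuples rather than any particular DAW's or library's event objects, and the `spike_ceiling` cap is a hypothetical stand-in for taming stray accents by hand:

```python
def scale_velocities(notes, factor=0.85, spike_ceiling=110):
    """Multiply every note velocity by `factor`, then clip obvious spikes.

    A `factor` between 0.7 and 0.9 narrows the velocity range as well as
    lowering the overall level. `spike_ceiling` is an illustrative cap for
    stray accents; genuinely intentional accents should be left alone.
    """
    scaled = []
    for time, note, vel in notes:
        new_vel = round(vel * factor)
        new_vel = min(new_vel, spike_ceiling)   # rein in isolated spikes
        new_vel = max(1, min(127, new_vel))     # stay within valid MIDI range
        scaled.append((time, note, new_vel))
    return scaled
```

Because it multiplies rather than offsets, loud notes are pulled down further than quiet ones, which is what compresses the velocity range and evens out haphazard playing.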

Concerns about MIDI velocity values usually run a lot deeper than that in most real‑world scenarios, though. (This stands to reason in a way, seeing as the two main variables of piano playing are when you hit each note and how hard you hit it — indeed, many people consider piano a member of the percussion instrument family for this reason.) If you’ve created your raw MIDI performance piecemeal, for instance, it’s very easy to end up with ungainly velocity transitions between overdubbed sections, or to find that separately performed right‑ and left‑hand parts don’t balance appropriately against each other. This may demand your applying further velocity offsets to smaller MIDI selections, and I think it’s worth attending to those edits before attempting any more detailed velocity contouring.

Velocity Edits

The top screenshot here shows a common problem with many project‑studio MIDI piano parts: the velocity values are simply too high.

Multiplying all the velocity data by a factor of 0.85 in this case brought the velocities (and therefore the piano timbre) to a more musical‑sounding place, as you can see in the second screenshot.

However, it’s also common for occasional isolated notes in a live‑performed part to be unduly accented, and these are better reined in individually, as shown in the bottom screenshot.

In terms of editing the velocities of individual notes, there are two general tactics I use to guide me. When it comes to evaluating the musicality and phrasing of piano parts, one of the best guides is simply to sing an approximation of whatever part you’re scrutinising. The mechanics of physically playing a piano part can somehow desensitise the player to awkwardly realised phrasing, whereas singing is a much more instinctive form of expression, so comparing the two is excellent at highlighting the former’s musical deficiencies. And you don’t have to be a trained singer, either — this technique will work just as well even if your voice curdles milk at 50 paces!

The second great workflow hack when refining your piano part’s MIDI velocity data is to audition the part within the mix context and refocus your attention from musical concerns to simple issues of mix balance. Can you hear the musically interesting lines well enough? Are the upper registers dominating over the bass notes? Are any unimportant beat subdivisions being over‑stressed at the expense of the rhythmic groove? Or are there certain notes that are causing unflattering frequency build‑ups in combination with other instruments in the mix? This last question deserves special attention, because it’s extremely common for some chord notes (often the root notes of the harmony) to be doubled sporadically by other parts in the arrangement, such that these pitches unduly dominate in the balance. Bear in mind, too, that a lot of less experienced keyboard players have difficulty balancing the power of their thumb against their fingers when playing oscillating or broken chord figurations, so this is another common source of localised MIDI velocity imbalances. From a mix perspective it’s usually a lot better to adjust velocity values to spot‑fix such moments at source, because trying to rebalance things with audio processing plug‑ins downstream can be a bit of a nightmare.

The Quantise Trap

Most live‑performed MIDI piano parts benefit from a certain amount of timing correction, in my view, simply because timing vagaries that pass unnoticed within the context of a one‑time gig can become a bit distracting when heard repeatedly within a recorded production. However, I strongly urge you to avoid the shortcut of applying any kind of large‑scale quantising routine. Let me explain why.

What you need to realise is that the constituent notes within any live‑performed piano chord almost never occur at exactly the same time. In fact, this time‑splaying of chord notes is a crucial and integral part of what makes piano parts sound human, and it’s frequently exaggerated for musical effect in many styles by deliberately staggering the note onsets from the lowest to the highest — creating an expressively ‘spread’ chord.

The top screenshot here shows the MIDI notes of a live‑performed piano chord that’s behind the beat. If you try to remedy this lateness with quantisation, the naturally splayed timing of the notes gets increasingly ironed out as the chord is progressively pulled towards the metric grid — as you can see if you work your way down through the remaining screenshots.

With this in mind, consider the situation where a quarter‑note piano chord is played, say, a 32nd note late, such that all the naturally time‑splayed chord notes fall behind the quarter‑note grid line in your MIDI editor. If you try to correct the chord timing by quantising towards the grid line, you’ll also ‘squash’ the natural time‑splaying between the chord notes. In other words, the more firmly you quantise in this way, the more you dehumanise the performance. Modern quantisation functions offer plenty of fancy options like variable strength, adjustable quantisation ‘window’, groove templates, and so on, but I’ve yet to find any scheme that doesn’t fall foul of the same essential dehumanisation problem when dealing with piano parts. And don’t expect any kind of MIDI ‘humanise’ function to bail you out here either, because the more or less randomised timing variations those typically generate are a poor substitute for the human time‑splaying that meaningfully responds to the musical context.

So what to do instead of quantising? Well, I hate to have to say it, but I reckon the best way to serve the music is to work manually, grabbing each wayward chord ‘cluster’ in your MIDI editor and sliding the whole thing forward/backward in time without interfering with the timing relationships between the constituent notes. Obviously you need to evaluate the efficacy of any timing adjustments by ear, but I find I can usually get into the right ballpark visually by deciding what the notional ‘centre’ of the chord is, rhythmically speaking, and dragging that to the metric grid. The location of a chord’s rhythmic centre will clearly depend on how it fits into the desired groove in your music, but for me it usually seems to be about a third of the distance between the onsets of the first and last notes in the chord cluster. That said, if you encounter some chords that feel rhythmically too indeterminate in a more uptempo production, then slightly reducing the time‑splaying can help tighten things back up.

Pedal Power

Another common shortcoming of project‑studio MIDI piano parts is the sustain pedalling. This is something that non‑pianists sometimes leave out completely, whereas most believable piano parts use the sustain pedal most of the time, even where the player would be perfectly capable of holding all the notes down without its assistance. The MIDI standard uses Continuous Controller 64 for this purpose, with values of 127 and zero corresponding, respectively, to the pedal being pressed down (ie. ‘on’) and released (ie. ‘off’). With real acoustic pianos, there’s actually a certain amount of range to the pedal’s effect between these extremes, and some virtual instruments attempt to simulate this, but I wouldn’t honestly bother with that — I reckon it’s overkill for mainstream productions.

In the Continuous Controller 64 editing lane here you can see a normal sustain‑pedalling data pattern, involving the pedal being released for a moment after each change of harmony.

If you don’t have much experience with using a sustain pedal, then the main thing to understand is the way the pedalling action typically relates to your chord changes. Broadly speaking, the normal scheme of things is that the pedal is pressed down to sustain each of the song’s harmonies in turn, and briefly released to damp any conflicting notes of that harmony from ringing on just as the next harmony sounds, following which the sustain pedal is reapplied to sustain the new harmony. This gives sustain‑pedal data a characteristic look to it during sustained harmonic playing, where the Continuous Controller value stays at 127 most of the time, but then hops down to zero for a moment just after each harmonic change. The critical decisions from a programming perspective, then, are:

  • When the zero region starts: too early and the previous harmony won’t transition smoothly into the new one; too late and the start of the new harmony will clash with the sustain of the previous one.
  • How long the region lasts: too long and the sound may lack a sense of sustain; too short and the previous chord won’t be sufficiently damped, again blurring the harmonic transition.

Piano Pro

You can make a big difference to a MIDI piano part’s musicality and believability by simply addressing velocity values, note timings and pedalling data using the tips laid out above. And if you’d like to hear audio examples of some of the things I’ve been writing about here, then visit https://sosm.ag/better-midi-pianos to have a listen for yourself.  

Choosing & Using A Software Instrument

I’ve talked a lot about improving your MIDI piano parts through programming tweaks, but the MIDI instrument you’re funnelling all that data into constitutes a vital part of the recipe too. It can be tough to choose the right MIDI piano instruments, though, seeing as there are so many contenders available on the market these days, ranging from the bundle presets in hardware keyboard workstations and software samplers, to highly sophisticated virtual instruments dedicated to fastidiously recreating specific piano models. So let me offer some advice here.

Above all, you need to find an instrument that suits the musical style and arrangement contexts you’re typically dealing with. Pianos for classical‑style scoring applications, for instance, will typically have a woodier and more conservative tone with plenty of rich low midrange, and while you might be able to press such an instrument into service (with some judicious 200‑300 Hz EQ scoop) to serve a smooth jazz number or stripped‑back singer‑songwriter ballad, you’ll struggle getting it to fit into any high‑energy pop, rock, or EDM production, where hard‑edged attack and a strong upper midrange tend to be more what the doctor ordered.

A second important priority is to find a patch with a sufficiently smooth velocity response. What I mean by this is that you should check that the timbre of the sound changes in a believable way as you gradually increase the velocity values of notes. Many of the piano patches in big sample‑library bundles suffer from too few samples per note, for instance, with the result that you get obvious and distracting timbral changes as you cross the boundaries between velocity ranges. For example, you might find that velocities 50‑99 use a single sample, so changes in velocity only adjust the volume of the note within that range; but then when you reach a velocity of 100, the piano’s timbre suddenly changes radically, because the sample engine switches over to a much harder‑sounding sample that’s assigned to the 100‑127 velocity range. This kind of thing can make it very difficult to play/program musical‑sounding parts, so I recommend carefully vetting potential instrument choices on this basis.

If you’re stuck with a slimline instrument with bad velocity switching, though, then one passable workaround is to restrict your velocity values to a limited range to avoid any obvious timbral discontinuities, and then use Continuous Controller 07, Continuous Controller 11, or DAW fader automation to refine the part’s level/phrasing beyond what velocity data editing can manage under the circumstances.
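That workaround amounts to clamping every note into a single velocity layer. A minimal sketch, reusing the hypothetical 50‑99 sample range from the example above and the same (time, note, velocity) tuple representation:

```python
def clamp_velocities(notes, lo=50, hi=99):
    """Confine note velocities to one velocity layer (here an assumed
    50-99 sample range) so the patch never crosses an audible
    sample-switch boundary. Overall level and phrasing are then shaped
    downstream with CC7/CC11 or fader automation instead."""
    return [(t, n, max(lo, min(hi, v))) for t, n, v in notes]
```

The real `lo`/`hi` values depend on where your particular patch switches samples, so audition a chromatic velocity ramp first to find the boundaries.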

If possible, try to select a software piano that models the natural sympathetic resonances of the acoustic instrument.

My third big wishlist item for any virtual piano instrument is that it simulates a real acoustic piano’s sympathetic resonance in some way. You see, pressing a piano’s sustain pedal in real life undamps not just the played notes, but all of the other notes too, which enriches the overall tone and also adds a subtle sense of coherence to the sound. What’s more, even with the sustain pedal up, a significant amount of sympathetic resonance still occurs. A software piano that simulates such resonances tends to feel more like a single organic instrument than a bunch of sampled notes, and it’ll also usually provide that additional lovely characteristic tonal ‘bloom’ that acoustic pianos deliver whenever the sustain pedal is pressed down.

Beyond that, there are plenty of instruments that go to inordinate lengths to emulate an instrument’s mechanical noises as well, but to my mind those are mostly just unnecessary gimmicks. It used to be that we’d spend ages in studios trying to avoid capturing those sounds, so we’re not used to hearing them that much within mainstream styles. And if you really do want the indie credentials of a creaky old upright, then frankly you might as well just record one, as that’ll likely give you a much more unique and ear‑catching result than any programmed instrument is capable of.