The function \(f(z) = \frac{e^z}{1 + e^z}\) is called the inverse-logit function: it maps any real number, such as a linear predictor \(X_i \beta\), to a probability strictly between 0 and 1.
The softmax generalizes the inverse-logit: it takes \(J\) real inputs (i.e., a vector in \(\mathbb{R}^J\)) and rescales them to \(J\) positive values that sum to one.
\[
\text{softmax}(z_j) = \frac{\exp(z_j)}{\sum_{k=1}^{J} \exp(z_k)}, \qquad j = 1,\ldots,J.
\]
Because the \(J\) outputs are positive and sum to one, it is natural to interpret them as probabilities.
The OJS widget below lets you experiment with \(J = 4\) inputs and see how the output probabilities change.
```{ojs}
viewof z1 = Inputs.range([-5, 5], {step: 0.1, value: 1, label: "z1"})
viewof z2 = Inputs.range([-5, 5], {step: 0.1, value: 0.5, label: "z2"})
viewof z3 = Inputs.range([-5, 5], {step: 0.1, value: -0.5, label: "z3"})
viewof z4 = Inputs.range([-5, 5], {step: 0.1, value: -1, label: "z4"})

// Softmax pieces
labels = ["Pr(A)", "Pr(B)", "Pr(C)", "Pr(D)"]
zs = [z1, z2, z3, z4]
exps = zs.map(Math.exp)
sumexp = exps.reduce((a, b) => a + b, 0)
ps = exps.map(e => e / sumexp)

// Data for stacked bar (one row, four segments)
data = ps.map((p, i) => ({label: labels[i], value: p, row: "softmax"}))

// === Two-column layout: sliders (left) + live table (right) ===
{
  const container =
    html`<div style="display:flex; gap:16px; align-items:flex-start;"></div>`;

  // Left column: sliders
  const left =
    html`<div style="display:grid; gap:8px; min-width:220px;"></div>`;
  left.append(viewof z1, viewof z2, viewof z3, viewof z4);

  // Right column: table
  const rows = labels.map((cls, i) => ({
    Class: cls,
    z: zs[i],
    "exp(z)": exps[i],
    Probability: ps[i]
  }));
  const right = html`<div style="min-width:260px;"></div>`;
  right.append(Inputs.table(rows, {
    columns: ["Class", "z", "exp(z)", "Probability"],
    format: {
      z: d => d.toFixed(2),
      "exp(z)": d => d.toFixed(3),
      Probability: d => (d * 100).toFixed(1) + "%"
    }
  }));

  container.append(left, right);
  return container;
}
```