GeGLU paper
Thread (23 posts)
Can you describe the geometry of the GeGLU activation function?
Thinking Summary
**Describing GeGLU Geometry** I'm focusing on the geometry of the GeGLU activation function. It involves the product of a GELU-transformed gate and a linear value. In the scalar case, I can express it as f(a, b) = GELU(a) * b, which helps visualize it in two dimensions. In terms of gradients, the gating creates an oblique hyperplane, smoothly transitioning between 0 and 1. I think it's helpful to consider this piecewise and note how GeGLU behaves like a cubic on the negative side near zero, resulting in a geometric shape that's not entirely symmetrical.

**Defining GeGLU Geometry** I'm planning to provide a clear definition and illustrate the shape in 2D, focusing on the gating dimension versus the scaling dimension. I'll also discuss the gradient vector field and explain the iso-lines, which resemble radial "sausages" oriented along the value dimension. Additionally, I want to highlight the distinction between the gating axis, referred to as the "gate dimension," and the "value dimension." It's important to deliver this in an organized way that clearly conveys the concepts.
Geometry of a single-neuron GeGLU
———————————————————————
(Write $x=(a,b)$, with $a$ the gate component and $b$ the value component that will ultimately flow forward.)
Scalar form:
$$f(a,b) \;=\; \mathrm{GELU}(a)\cdot b \;=\; a\,\Phi(a)\,b,$$
where $\Phi$ is the standard normal CDF.
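As a concrete reference, a minimal scalar sketch in Python (the helper names `gelu` and `geglu_scalar` are illustrative; the exact GELU is written via the error function, $\Phi(a)=\tfrac12(1+\mathrm{erf}(a/\sqrt{2}))$):

```python
import math

def gelu(a: float) -> float:
    # Exact GELU: a * Phi(a), with Phi the standard normal CDF
    # expressed through the error function.
    return a * 0.5 * (1.0 + math.erf(a / math.sqrt(2.0)))

def geglu_scalar(a: float, b: float) -> float:
    # Single-neuron GeGLU: the gate coordinate a modulates the value coordinate b.
    return gelu(a) * b

print(geglu_scalar(0.0, 3.0))  # 0.0   -- the multiplier GELU(0) is exactly 0
print(geglu_scalar(2.0, 3.0))  # ~5.86 -- gate nearly open: GELU(2) ~ 1.95
```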
Domain & codomain
• Domain: the 2-D plane $\mathbb{R}^2$.
• Codomain: the real line $\mathbb{R}$ (for one neuron) or, in the vector case, the $d_{\text{ff}}$-dimensional subspace spanned by the “value” projection.
Geometrically we are mapping each point $(a,b)$ of the plane to a signed height above/below it.
Separating axes
• The $a$-axis controls a smooth gate, $\mathrm{GELU}(a)=a\,\Phi(a)$, whose sigmoidal factor $\Phi(a)$ ranges continuously from 0 (far negative) through 0.5 (at the origin) toward 1 (large positive).
• The $b$-axis is passed through unchanged and merely gets scaled by that gate.
Thus the plane is foliated into vertical lines (constant $a$) that are uniformly compressed or expanded.
Surface shape
Plotting $z=f(a,b)$ gives a saddle-like surface that is:
• Flat along the entire $a$-axis at $b=0$ (because the value is zero).
• “Fanning out” in the $b$ direction as $|b|$ increases, with its amplitude modulated along $a$ by the smooth gate $\mathrm{GELU}(a)$.
For fixed $b$ the cross-section is exactly the 1-D GELU curve scaled by $b$; for fixed $a$ it is a straight line through the origin with slope $\mathrm{GELU}(a)$.
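To see the surface directly, a small plotting sketch (assumes numpy, scipy, and matplotlib are available):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.special import erf

def gelu(a):
    # Vectorised exact GELU: a * Phi(a).
    return a * 0.5 * (1.0 + erf(a / np.sqrt(2.0)))

# Sample the (a, b) plane and evaluate z = GELU(a) * b.
a = np.linspace(-4.0, 4.0, 200)
b = np.linspace(-4.0, 4.0, 200)
A, B = np.meshgrid(a, b)
Z = gelu(A) * B

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(A, B, Z, cmap="viridis")
ax.set_xlabel("a (gate)")
ax.set_ylabel("b (value)")
ax.set_zlabel("GeGLU(a, b)")
plt.show()
```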
Level sets (iso-contours)
Solving $f(a,b)=c$ gives
$$b \;=\; \frac{c}{\mathrm{GELU}(a)}$$
whenever $\mathrm{GELU}(a)\neq 0$.
These are rational curves that resemble hyperbolas, but whose arms pinch smoothly toward the half-plane $a<0$, where the gate shrinks to zero. No contour for $c\neq 0$ ever crosses into the far-negative-$a$ region, because the output there is arbitrarily close to 0.
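The contours can be traced numerically from $b=c/\mathrm{GELU}(a)$; a sketch that masks the pinch region where the gate is near zero:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.special import erf

def gelu(a):
    return a * 0.5 * (1.0 + erf(a / np.sqrt(2.0)))

a = np.linspace(-3.0, 4.0, 1000)
g = gelu(a)
for c in (0.5, 1.0, 2.0):
    # b = c / GELU(a); mask points where the gate is ~0 and b blows up.
    b = np.where(np.abs(g) > 1e-3, c / g, np.nan)
    plt.plot(a, b, label=f"f(a, b) = {c}")
plt.ylim(-20, 20)
plt.xlabel("a (gate)")
plt.ylabel("b (value)")
plt.legend()
plt.show()
```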
Gradient field
• Along the $b$ direction the gradient is constant w.r.t. $b$: each vertical line has the same slope, $\partial f/\partial b = \mathrm{GELU}(a)$.
• Along the $a$ direction the gradient is $\partial f/\partial a = b\,\mathrm{GELU}'(a)$: it vanishes as $a\to-\infty$ (saturation), peaks near $a\approx 1.4$ (a slight overshoot, $\mathrm{GELU}'\approx 1.13$), and decays back toward $b$ for very large $a$ as the factor $\Phi(a)$ plateaus at 1.
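Both statements are easy to check numerically, using $\mathrm{GELU}'(a)=\Phi(a)+a\,\varphi(a)$. A sketch locating the overshoot on a grid (the peak values in the comments are read off this grid):

```python
import numpy as np
from scipy.stats import norm

a = np.linspace(-4.0, 6.0, 2001)
dgelu = norm.cdf(a) + a * norm.pdf(a)  # GELU'(a) = Phi(a) + a * phi(a)

i = np.argmax(dgelu)
print(a[i], dgelu[i])        # peak near a ~ 1.4, value ~ 1.13
print(dgelu[0], dgelu[-1])   # ~0 at far left (saturation), ~1 at far right (plateau)
```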
Vectorised GeGLU inside a Transformer FFN
——————————————————————————————
Let the input vector $x\in\mathbb{R}^{d_{\text{model}}}$ be linearly projected into two halves:
$$g = xW + b \quad\text{(gate)}$$
$$v = xV + c \quad\text{(value)}$$
and produce $\mathrm{GeGLU}(x)=\mathrm{GELU}(g)\odot v$. (Hadamard product.)
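In code the vectorised layer is just two linear projections and a Hadamard product. A minimal numpy sketch following the paper's formula (dimensions and random initialisation are hypothetical placeholders):

```python
import numpy as np
from scipy.special import erf

def gelu(x):
    return x * 0.5 * (1.0 + erf(x / np.sqrt(2.0)))

def geglu(x, W, b, V, c):
    # GeGLU(x) = GELU(xW + b) ⊙ (xV + c)
    return gelu(x @ W + b) * (x @ V + c)

d_model, d_ff = 8, 32                     # hypothetical sizes
rng = np.random.default_rng(0)
W = rng.normal(size=(d_model, d_ff))
V = rng.normal(size=(d_model, d_ff))
b = np.zeros(d_ff)
c = np.zeros(d_ff)

x = rng.normal(size=(d_model,))
h = geglu(x, W, b, V, c)                  # hidden activation, shape (d_ff,)
```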
Geometrically:
- The value projection $v=xV+c$ lands in an affine subspace of the FFN's $d_{\text{ff}}$-dimensional hidden space, spanned by the rows of $V$ and shifted by $c$.
- The gate vector $\mathrm{GELU}(g)$ acts as a smooth, element-wise scaling mask that compresses coordinates of that subspace toward the origin depending on how “positive” the companion coordinates in $g$ are.
- Consequently, the reachable set of $\mathrm{GELU}(g)\odot v$ is a warped, axis-aligned “cone” inside that hidden space whose opening width varies smoothly with $g$. No new coordinate directions are introduced; existing ones are re-weighted.
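The element-wise masking is easy to observe in isolation (a sketch; driving one gate coordinate far negative switches off just its paired value coordinate):

```python
import numpy as np
from scipy.special import erf

def gelu(x):
    return x * 0.5 * (1.0 + erf(x / np.sqrt(2.0)))

g = np.array([-6.0, 1.0, 6.0])    # gate coordinates: closed, partial, open
v = np.array([10.0, 10.0, 10.0])  # identical value coordinates
print(gelu(g) * v)                # ~[-0.0, 8.4, 60.0]: coordinates gated independently
```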
Comparisons & intuition
——————————————
ReLU gate (ReGLU): sharp half-space clipping; the output lies on two flat half-cones separated by the hard plane $a=0$.
GELU gate: replaces that plane by a sigmoidal transition band, bending the cone walls smoothly. The resulting surface is everywhere differentiable, giving GeGLU its smoother optimization landscape while still preserving a gating geometry (outputs shrink continuously to zero as any gate coordinate drifts negative).
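The smoothness difference shows up in the gate's derivative at the switching plane (a sketch: the ReLU gate's slope jumps from 0 to 1 at $a=0$, while the GELU gate's slope passes continuously through 0.5):

```python
import numpy as np
from scipy.stats import norm

a = np.linspace(-0.1, 0.1, 5)
relu_slope = (a > 0).astype(float)          # ReLU'(a): hard jump at a = 0
gelu_slope = norm.cdf(a) + a * norm.pdf(a)  # GELU'(a): smooth through 0.5

print(relu_slope)  # [0. 0. 0. 1. 1.]            -- discontinuous
print(gelu_slope)  # ~[0.42 0.46 0.50 0.54 0.58] -- continuous
```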
Bottom line
—————
GeGLU carves the plane into smoothly-gated linear pencils: each “value” axis is a straight line whose amplitude envelope is the soft, Gaussian-CDF-gated ramp of the GELU applied to its paired “gate” axis. In higher dimensions, this becomes an axis-aligned, differentiable gating cone embedded in the FFN's hidden space.