

𓆩Psychopomp𓆪

     A movement-driven narrative exploration game

👥 4 People
🕗 16 Weeks
🛠️ Unity, C#, ShaderGraph, Cinemachine, Wwise
🧢 Programmer, Producer

🔗 View Full Psychopomp Repository on GitHub ↗   📑 View Production Archive & Documentation ↗


Overview


A 3D narrative exploration game built around flight, ritual, and exploration.
You play as Hermes, guiding souls across a procedurally shifting ocean.
Every system (movement, camera, sound, shader, and even the world itself) reacts to the player's motion.

I was the Gameplay & Systems Programmer and the Producer, responsible for building the movement pipeline, camera logic, reactive shaders, and event-driven architecture. I also led all production sprints (documentation for which can be found above).

This page breaks down how I turned design goals into systems that scale, self-express, and feel good to debug.

The Design Problem


We needed a character controller that:

  • Felt fluid and dreamlike, yet remained 100% deterministic.
  • Let every subsystem (camera, sound, shader) react to motion without direct dependencies.
  • Could be tuned without fear of breaking downstream logic.

The solution had to be mathematically consistent but also aesthetically expressive.
In short, the code needed to feel alive.


Technical Systems Breakdown


Psychopomp was engineered around deterministic motion, modular event architecture, and real-time world feedback.
Each subsystem (movement, camera, audio, shader) communicates via GameEvents, eliminating cross-dependencies while staying reactive.

1. Movement Architecture: Precision Over Physics
↗ [View Code File on GitHub]
A custom kinematic controller replaces Unity’s Rigidbody, giving complete control over acceleration, glide curves, and timing forgiveness.
⚙️ Core Movement Physics & Surface Handling
// --- Handle grounded movement and air gravity ---
private void HandleGroundChecks() {
    _wasGrounded = _isGrounded;
    _isGrounded = _colUtil.IsGroundedCast(transform.position, out _currentNormal, out _currentGroundTag);
    _isBroadlyGrounded = _colUtil.BroaderIsGroundedCast(transform.position);

    if (_isGrounded) {
        _currentCoyote = _coyoteTime;
        _isJumping = false;
        _isGliding = false;
        _currentJumpCount = 0;
        _glideTimer = 0;
        _fallingVelocity = 0;
        vfx.PlaySplashSound();
    }
    else {
        _currentCoyote -= Time.deltaTime;
        vfx.StopSplashSound();
        _fallingVelocity = CalculateFallingVelocity();
    }
}

// --- Gravity + Glide interaction ---
private float CalculateFallingVelocity() {
    float gravityFactor = (_fallingVelocity < 0) ? _fallGravityMultiplier : 1f;
    float maxFall = _maxFallSpeed;
    if (_isGliding) {
        gravityFactor = _glideGravityFactor;
        maxFall = _glideMaxFallSpeed;
    }
    return Mathf.Max(_fallingVelocity - (_gravity * gravityFactor * Time.deltaTime), -maxFall);
}

// --- Apply ground snapping and ramp correction ---
private Vector3 ApplyFinalMovement() {
    Vector3 moveAttempt = _currentVelocity * Time.deltaTime;
    Vector3 correctedMove = _colUtil.CollideAndSlide(moveAttempt, transform.position, 1, GetRampMultiplier());
    if (_isGrounded) correctedMove = SnapToGround(correctedMove);

    transform.position += correctedMove;
    _currentVelocity = correctedMove / Time.deltaTime;
    return correctedMove;
}

// --- Dynamic ramp magnitude multiplier ---
private float GetRampMultiplier() {
    float dot = Vector3.Dot(Vector3.up, _currentNormal);
    if (dot < 0.95f && dot > 0.05f)
        return _rampMagnitudeMultiplier;
    return 1f;
}
🪂 Pitch-Based Glide Control ↗ [View Code File on GitHub]
private float CalculateGlidingAcceleration(float velocity) {
    float pitch = Mathf.DeltaAngle(transform.eulerAngles.x, 0f);
    float norm  = Mathf.Clamp(pitch / _maxYZRotationAngle, -1f, 1f);
    float accel = norm > 0 ? _glideAcceleration : _glideDeceleration;
    return Mathf.Clamp(velocity + accel * norm * Time.deltaTime, _minSpeed, _absoluteMaxSpeed);
}


Pitch controls acceleration: dive to gain speed, pitch up to slow descent. This transforms math into expression; movement becomes body language.

⏱️ Jump Buffering & Coyote Time ↗ [View Code File on GitHub]
// Buffer jump input so slightly-early presses still count
if (Input.GetButtonDown("Jump"))
    _currentJumpBuffer = jumpBufferTime;
else
    _currentJumpBuffer -= Time.deltaTime;

// Coyote time lets the player jump shortly after leaving a ledge
bool canJump = _currentCoyote > 0f && _currentJumpBuffer > 0f;
if (canJump) ExecuteJump();

private void ExecuteJump() {
    _isJumping = true;
    GameEvents.Jump();
    _currentCoyote = 0;
    _currentJumpBuffer = 0;
    _currentJumpCount++;
    _fallingVelocity = _jumpVelocity;
}
Small timing windows forgive human latency, making the game feel intuitive and kind, a mechanical version of mercy.

2. Event-Driven Architecture: Shared Language

To decouple systems, I introduced a global static event hub.
↗ [View Code File on GitHub]


All systems listen for events instead of polling the controller.
📡 Event Hub Definition
public static class GameEvents {
    public static event Action OnJump;
    public static event Action OnLand;
    public static event Action OnStartGlide;
    public static event Action OnStopGlide;
    public static event Action<int> OnTierChanged; // carries the new tier index

    public static void Jump() => OnJump?.Invoke();
    public static void Land() => OnLand?.Invoke();
    public static void StartGlide() => OnStartGlide?.Invoke();
    public static void StopGlide() => OnStopGlide?.Invoke();
    public static void TierChanged(int tier) => OnTierChanged?.Invoke(tier);
}
🎮 Subsystem Listeners ↗ [View Code Folder on GitHub]
// CameraController.cs
private Action _onStartGlide;

void OnEnable() {
    _onStartGlide = () => SwitchToCamera(_glideCameras[_currentTier]); // stored so it can be removed
    GameEvents.OnStartGlide  += _onStartGlide;
    GameEvents.OnTierChanged += UpdateNoiseCurves;
}

void OnDisable() {
    // Unsubscribe to avoid dangling handlers on disabled objects
    GameEvents.OnStartGlide  -= _onStartGlide;
    GameEvents.OnTierChanged -= UpdateNoiseCurves;
}

// AudioManager.cs
void OnEnable() {
    GameEvents.OnStartGlide += SwitchToFlyingMusic;
    GameEvents.OnStopGlide  += SwitchToAmbientMusic;
    GameEvents.OnJump       += PlayShockAudio;
}

This architecture turned gameplay programming into modular composition: scalable, production-safe, and easy to debug.
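To illustrate that composition, a brand-new reactive feature only needs to subscribe to the hub. The sketch below is hypothetical (LandingDustVFX, _dust, and PlayBurst are invented for illustration; only GameEvents comes from the project), but it shows the full extent of the wiring required:

```csharp
using UnityEngine;

// Hypothetical example of adding a new reactive feature:
// no references to the controller, just a hub subscription.
public class LandingDustVFX : MonoBehaviour {
    [SerializeField] private ParticleSystem _dust;

    void OnEnable()  { GameEvents.OnLand += PlayBurst; }
    void OnDisable() { GameEvents.OnLand -= PlayBurst; } // avoid dangling handlers

    private void PlayBurst() => _dust.Play(); // fires whenever the controller lands
}
```

Because the feature never touches the movement code, it can be added or removed without risking downstream breakage.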

3. Shader Integration: Ocean as System
The ocean shader reacts to flight speed and player movement; it is built in Shader Graph but driven from C#. By exposing parameters globally, GPU-side visuals respond instantly without frame delay.

🌊 Shader Parameter Sync
// Pushed from the movement controller each frame
Shader.SetGlobalFloat("_WindSpeed", _speed / _absoluteMaxSpeed);
Shader.SetGlobalVector("_PlayerPosition", transform.position);
💠 Technical Breakdown
1. Dual Normal Blending: two scrolling normal maps offset by wind speed.
2. Depth-Based Color: SceneDepth × FarPlane → Lerp(shallow, deep).
3. Vertex Displacement: GradientNoise(worldPos) × amplitude.
4. Reflection Falloff: view angle × normal strength.
5. Wind Sync: _WindSpeed driven by player velocity.
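A minimal driver for those global parameters might look like the following. This is a sketch, not the project's actual script: OceanShaderDriver is an invented name, and the Speed / AbsoluteMaxSpeed accessors are assumed; only the _WindSpeed and _PlayerPosition property names mirror the snippet above.

```csharp
using UnityEngine;

// Hypothetical sketch: pushes movement state to the global shader
// parameters once per frame, after movement has been resolved.
public class OceanShaderDriver : MonoBehaviour {
    [SerializeField] private PlayerController _player; // assumed accessors below

    void LateUpdate() {
        // Normalized speed drives normal-map scrolling and wave sync
        Shader.SetGlobalFloat("_WindSpeed", _player.Speed / _player.AbsoluteMaxSpeed);
        // World position feeds displacement falloff around the player
        Shader.SetGlobalVector("_PlayerPosition", _player.transform.position);
    }
}
```

Using Shader.SetGlobalFloat/SetGlobalVector (rather than per-material properties) lets every water material read the same values without material instancing.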

These systems made the world breathe with the player. Physics, math, and art moving in one heartbeat.

4. Production Results
Metric                               | Before           | After
Controller latency                   | 6–8 ms variance  | < 2 ms deterministic
Cross-system bugs                    | ~12 per sprint   | < 4 per sprint
Setup time for new reactive feature  | ~30 min          | ~15 min



thanks for visiting
made with 🩷and ☕ on a beautiful June afternoon🌻