Scripting Movement

In my twelfth video of the “Building a Game Development Framework” series, I introduce a Scripted Movement System. This allows developers to define a sequence of automated actions for Non-Player Characters (NPCs) or cinematic events without writing manual update logic for every frame.

Before diving into scripting, I made a few essential adjustments to the core framework:

  • Window Resizing: Desktop window sizes are now correctly handled via LibGDX configurations and the resize method, ensuring the game’s internal coordinate system matches the actual window dimensions.
  • Safe Sprite Removal: To avoid ConcurrentModificationExceptions (crashes that occur when a sprite is deleted while the game is still drawing or updating it), the framework now uses a “To Be Added” list and a removed flag. Sprites are only physically added to or removed from the main list between render cycles.
  • Clean Interfaces: Redundant methods were removed from the Movement interface, delegating all positional updates to the SpriteUpdate class.
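The deferred add/remove pattern can be sketched in plain Java. This is a minimal illustration of the idea; the class and method names are mine, not the framework’s actual API:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of deferred sprite addition/removal. Mutations requested during a
// render cycle are queued and only applied between cycles, so the main list
// is never modified while it is being iterated.
class SpriteList {
    static class Sprite {
        boolean removed = false;   // marked for removal, processed later
    }

    private final List<Sprite> sprites = new ArrayList<>();
    private final List<Sprite> toBeAdded = new ArrayList<>();

    void add(Sprite s) { toBeAdded.add(s); }      // safe to call mid-update
    void remove(Sprite s) { s.removed = true; }   // safe to call mid-update

    // Called once per frame, between render cycles.
    void applyPendingChanges() {
        sprites.addAll(toBeAdded);
        toBeAdded.clear();
        sprites.removeIf(s -> s.removed);
    }

    int size() { return sprites.size(); }
}
```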

The scripting system is built around two main components:

  • ScriptMovement: The “brain” that holds a list of actions and executes them one by one. Once an action reports it is “done,” the script automatically transitions to the next one in the list.
  • ScriptAction: The base class for specific behaviors (waiting, moving, turning). It includes logic for Gradual Direction Changes, allowing actors to make smooth arcs rather than “snapping” to a new heading.
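The brain-plus-actions design above can be sketched as follows. This is a simplified stand-in for the real classes, assuming an action exposes an update step and a “done” check:

```java
import java.util.List;

// Sketch of the ScriptMovement idea: run actions in order, advancing
// whenever the current action reports it is done.
interface ScriptAction {
    void update(float deltaTime);
    boolean isDone();
}

// Example action: pause for a fixed number of seconds.
class WaitAction implements ScriptAction {
    private float remaining;
    WaitAction(float seconds) { remaining = seconds; }
    public void update(float deltaTime) { remaining -= deltaTime; }
    public boolean isDone() { return remaining <= 0f; }
}

class ScriptMovement {
    private final List<ScriptAction> actions;
    private int current = 0;

    ScriptMovement(List<ScriptAction> actions) { this.actions = actions; }

    void update(float deltaTime) {
        if (current >= actions.size()) return;     // script finished
        ScriptAction action = actions.get(current);
        action.update(deltaTime);
        if (action.isDone()) current++;            // transition to next action
    }

    int currentIndex() { return current; }
}
```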

I detail several specialized actions that can be chained together:

  • Wait Action: Pauses the actor for a set duration. You can also specify a direction for the actor to face while waiting.
  • Time Movement Action: Moves the actor in a specific direction at a set speed for a fixed amount of time.
  • Change Speed Action: Gradually accelerates or decelerates the actor over time.
  • Destination Action: Moves the actor to a specific (X, Y) coordinate. It uses the Pythagorean theorem to calculate the distance and ensures the actor stops exactly on the target.
  • Easing Action: Provides advanced acceleration/deceleration curves (Linear, Quadratic, Cubic, etc.). This allows for “juicy” movement where an actor starts slow and “eases in” to a high speed.
  • Go-To Action: A logic-based action that jumps to a previous step in the script, allowing for infinite loops or repeating patterns.
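The easing curves behind the Easing Action can be sketched with standard formulas; the exact curves in the framework may differ, but the shapes are conventional:

```java
// Standard easing curves over a normalized time t in [0, 1].
// "In" curves start slow and accelerate; "Out" curves start fast and settle.
class Easing {
    static float linear(float t)       { return t; }
    static float quadraticIn(float t)  { return t * t; }             // slow start
    static float cubicIn(float t)      { return t * t * t; }         // even slower start
    static float quadraticOut(float t) { return 1f - (1f - t) * (1f - t); } // slow finish

    // Interpolate a speed between start and end using an eased fraction.
    static float easeSpeed(float start, float end, float easedT) {
        return start + (end - start) * easedT;
    }
}
```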

I then demonstrate how to combine these actions to create a complex patrol path:

  1. Ease In: The character starts from a standstill and slowly speeds up.
  2. Move to Destination: The character hits a series of four (X, Y) coordinates to walk in a square.
  3. Go-To: After the fourth corner, a GoToAction sends the script back to the second step, creating a permanent walking loop.
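The Move to Destination step relies on the Pythagorean-distance calculation mentioned earlier; one step of that logic can be sketched like this (names are illustrative, not the framework’s API):

```java
// One update step of a destination-seeking move: head toward the target at
// a fixed speed, and clamp to the target so the actor stops exactly on it.
class Destination {
    static float[] step(float x, float y, float targetX, float targetY,
                        float speed, float deltaTime) {
        float dx = targetX - x;
        float dy = targetY - y;
        float distance = (float) Math.sqrt(dx * dx + dy * dy); // Pythagorean theorem
        float travel = speed * deltaTime;
        if (travel >= distance) {
            return new float[] { targetX, targetY };  // arrive exactly on target
        }
        // Move along the normalized direction vector.
        return new float[] { x + dx / distance * travel, y + dy / distance * travel };
    }
}
```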

This scripting system significantly reduces the complexity of AI behavior, allowing developers to “choreograph” NPCs using a simple list of commands.

Handling Collisions in Video Games

In my eleventh video of the “Building a Game Development Framework” series, I implement Collision Detection and update the Screen Management system. This update allows game objects to interact with each other physically and simplifies how developers add entities to their game world.

The foundation of the collision system is the BoundingBox class.

  • AABB (Axis-Aligned Bounding Box): The framework uses simple rectangle-to-rectangle intersection checks. It verifies if the X or Y edges of one sprite fall within the span of another.
  • Insets and Offsets: A major improvement I made is adding Insets. Often, a sprite’s image (for example, 64×64 pixels) is much larger than the actual character (e.g., a person is thin, but the image is a square). Insets let you shrink the physical “hitbox” independently of the visual image, so collisions feel more realistic.
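The AABB check with insets can be sketched in plain Java. This is my simplified version, not the framework’s exact BoundingBox code; insets are subtracted from each edge of the image rectangle before the overlap test:

```java
// Axis-aligned bounding box whose hitbox can be shrunk by per-edge insets.
class BoundingBox {
    float x, y, width, height;                       // image rectangle
    float insetLeft, insetRight, insetTop, insetBottom;

    BoundingBox(float x, float y, float w, float h) {
        this.x = x; this.y = y; this.width = w; this.height = h;
    }

    float left()   { return x + insetLeft; }
    float right()  { return x + width - insetRight; }
    float bottom() { return y + insetBottom; }
    float top()    { return y + height - insetTop; }

    // Rectangle-to-rectangle intersection on the inset-adjusted edges.
    boolean intersects(BoundingBox other) {
        return left() < other.right() && right() > other.left()
            && bottom() < other.top() && top() > other.bottom();
    }
}
```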

I set up the framework to categorize collisions based on whether an object is an “Actor” or a “Decor.” Since Decor objects do not move, they never initiate collision checks; Actors, which do move, check whether they collide with other sprites.

  • Impact Flag: Not all objects need to check for collisions. The doesImpact() method allows the system to skip checks for static background objects (Decor) that don’t move, saving processing power.
  • Reversion Logic: When a collision is detected, the framework “reverts” the character’s movement. In the basic implementation, it simply moves the character back to their previous non-colliding position.
  • Sliding Physics: In the more advanced movement modes (such as four-key or mouse movement), the framework checks horizontal and vertical collisions separately. If you walk diagonally into a wall, the X-movement might be blocked, but the system allows the Y-movement to continue, creating a “sliding” effect against the wall.
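The separate-axis check behind the sliding effect can be sketched as follows. The wall test here is a stand-in supplied by the caller, not the framework’s real collision query:

```java
// Sketch of sliding movement: attempt each axis independently and revert
// only the axis that is blocked, so diagonal motion slides along walls.
class Slider {
    interface Blocked { boolean at(float x, float y); }

    static float[] move(float x, float y, float dx, float dy, Blocked wall) {
        float newX = x + dx;
        if (wall.at(newX, y)) newX = x;    // X blocked: revert X only
        float newY = y + dy;
        if (wall.at(newX, newY)) newY = y; // Y blocked: revert Y only
        return new float[] { newX, newY };
    }
}
```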

I also introduce the SimpleScreen, which automates the rendering loop:

  • The Sprite List: Instead of manually updating every character, the screen maintains a list of all Sprites.
  • Automated Loop: The renderChild method automatically iterates through the list, calls update(), performs detectCollision(), and then calls draw() for every entity.
  • Separation of Concerns: New abstract methods addActor() and addDecor() provide a clean way for developers to populate their game world during the show() phase.
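The automated loop can be sketched like this; the interface and method names are illustrative stand-ins for the framework’s sprite and screen classes:

```java
import java.util.List;

// Sketch of the renderChild loop: each frame, every sprite in the list is
// updated, collision-checked, then drawn, with no per-entity manual code.
interface GameSprite {
    void update(float deltaTime);
    void detectCollision(List<GameSprite> others);
    void draw();
}

class SimpleScreenLoop {
    static void renderChild(List<GameSprite> sprites, float deltaTime) {
        for (GameSprite sprite : sprites) {
            sprite.update(deltaTime);
            sprite.detectCollision(sprites);
            sprite.draw();
        }
    }
}
```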

To make the world feel more alive, I introduce a Random Sound class:

  • Footstep Variance: Instead of a single repetitive sound, this class holds a list of similar sounds (e.g., four different footstep recordings). Every time an animation triggers a sound, it randomly picks one from the list to create a more natural, less “robotic” audio experience.
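The idea can be sketched with plain Java; the real class would hold LibGDX Sound objects rather than strings, and the seed here exists only to keep the example repeatable:

```java
import java.util.List;
import java.util.Random;

// Sketch of the Random Sound idea: each trigger picks one clip at random
// from a pool of similar recordings.
class RandomSound {
    private final List<String> clips;
    private final Random random;

    RandomSound(List<String> clips, long seed) {
        this.clips = clips;
        this.random = new Random(seed);  // seeded only for repeatability
    }

    String play() {
        return clips.get(random.nextInt(clips.size()));
    }
}
```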

I conclude with a demonstration of an Actor (the player) navigating around our “Soda Machine” Decor objects. I show how the player is physically blocked by the machines and how the “sliding” logic feels smoother than a hard stop.

This set of features transforms the framework from a simple animation player into a functional game engine where objects have physical presence and sound.

Sound and Music with LibGDX

In my tenth video of the “Building a Game Development Framework” series, I focus on integrating audio capabilities into the LibGDX framework. I cover the distinction between sound effects and music, how to manage their lifecycles, and how to trigger them within an animation system.

The LibGDX framework distinguishes between two types of audio objects based on how they are handled in memory:

  • Sound: Intended for short clips (typically < 10 seconds), such as explosions or footsteps. These are completely loaded into RAM.
  • Music: Intended for longer tracks, like background music or ambient loops. These are streamed from the storage device to save memory.

I demonstrate the primary methods for controlling both types of audio:

  • Playback: Simple .play(), .pause(), .resume(), and .stop() methods. Note that Music does not have a dedicated “resume” method; calling play() on a paused track automatically resumes it from where it left off.
  • Volume & Pitch: Volume is set as a float from 0.0 to 1.0. Pitch (available only for Sound) speeds up or slows down the audio to change its tone—useful for varied sound effects like different-pitched “hellos”.
  • Panning: Moves audio between the left and right speakers (values from -1.0 to 1.0). This only works on mono audio files.
  • Positioning: Specifically for Music, you can jump to a specific second in a track using setPosition().

Because audio objects consume system resources, they must be “disposed” when no longer needed.

  • Automatic Disposal: I have updated the BaseScreen with maps for both Sound and Music objects. When a screen is closed, the framework automatically iterates through these maps and disposes of every audio file to prevent memory leaks.
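The disposal pattern can be sketched as follows. Disposable here stands in for LibGDX’s own Disposable interface, and the registry is my simplified version of the screen’s audio maps:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of automatic audio disposal: the screen tracks every Sound and
// Music resource in maps and disposes them all when it closes.
interface Disposable {
    void dispose();
}

class AudioRegistry {
    private final Map<String, Disposable> sounds = new HashMap<>();
    private final Map<String, Disposable> music = new HashMap<>();

    void registerSound(String name, Disposable s) { sounds.put(name, s); }
    void registerMusic(String name, Disposable m) { music.put(name, m); }

    // Called when the screen closes, preventing audio memory leaks.
    void disposeAll() {
        sounds.values().forEach(Disposable::dispose);
        music.values().forEach(Disposable::dispose);
        sounds.clear();
        music.clear();
    }
}
```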

I then update the framework to support “sound-aware” animations. By adding a Sound field to the BaseAnimation class, animations can now trigger audio automatically:

  • Static/Block Animations: Play a sound once when they first appear on the screen.
  • Loop Animations: Play a sound at the start of every loop cycle (e.g., a footstep sound every time a walk cycle reaches the first frame).
  • Bounce Animations: Trigger sounds at both the beginning and the “apex” (last frame) of the animation, making it ideal for things like bouncing balls or swinging pendulums.
  • Random Animations: Use a counter to ensure a sound plays every time the animation completes a full set of frames, even though the frames themselves are randomized.
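The loop-animation case can be sketched in plain Java; the counter stands in for actually calling play() on a Sound object, and the class names are illustrative:

```java
// Sketch of a sound-aware loop animation: a sound fires each time the
// loop wraps back to its first frame (e.g., one footstep per walk cycle).
class LoopAnimation {
    private final int frameCount;
    private int frame = 0;
    private int soundsPlayed = 0;   // stands in for sound.play()

    LoopAnimation(int frameCount) { this.frameCount = frameCount; }

    void nextFrame() {
        frame = (frame + 1) % frameCount;
        if (frame == 0) soundsPlayed++;  // start of a new loop cycle
    }

    int soundsPlayed() { return soundsPlayed; }
}
```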

I then recommend two browser-based tools for generating 8-bit/16-bit sound effects for game development:

  • SFXR: A classic generator for “retro” sounds.
  • ChipTone: A more modern, user-friendly version of SFXR for creating explosions, jumps, and other game effects.