UP DOWN READY

Hi there! You've probably reached this page from the Freeplay website in search of the elusive 'Up Down Ready', which took Best Design in this year's awards.

UPDATE: Hello! If you've arrived here from StumbleUpon, why not go play the game now on Kongregate! With high scores and EVERYTHING.

The versions of the game that were on this page were hosted on Dropbox. Since the recent spike in traffic, Dropbox has cut off access to my public folder. I will hopefully be migrating this stuff across to an actual server, but the version on Kongregate is the official release.

I'm Delia, the Sword Lady half of Sword Lady and the Viking. You can find my teammate over here. We are third year students in the Games Design course at Griffith University, and Up Down Ready (previously known as Horse) is our first semester project from this year, made using Adam Atomic's free Flixel library.

We entered it in the Freeplay Awards for a lark, and were incredibly surprised and excited to make it to the finals and take away an award. We hope you enjoy playing the game as much as we enjoyed making it!

- Delia

Sunday, April 25, 2010

Bang! (Clay pigeons are fuckers)

The first build of the Sound Effects Handler for Unity is go!

It is a single invisible GameObject that sits quietly in your scene until you invoke one of its public static methods from another script, whereupon it makes some kind of noise. It is dead simple, but it took a while for my thinking to progress to the stage where I could make it so.

The basic premise of the handler is that it centralises all of the sound effects contained in the scene.

As far as I know, Unity is designed to handle sound by attaching AudioSource components to objects, whose functions are then called from that object's script. While this approach has its uses, it is one that I can only imagine would be immensely frustrating for a sound designer who had no contact with the game other than to provide sound effects.
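To make the contrast concrete, that conventional per-object approach looks something like this (a minimal C# sketch; the class name and trigger event are placeholders, and the clip is assumed to be assigned to the object's own AudioSource in the Inspector):

using UnityEngine;

public class Door : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // Plays whatever clip is assigned to this object's own AudioSource.
        GetComponent<AudioSource>().Play();
    }
}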

Funnily enough, this is the role that I am taking on with all of the class' projects bar Horse. I decided that I wanted an interface that allows me to put all the sound effects into a scene and then say to my fellow developers this:

"Here are your sounds. This is what they are called. Here are some simple lines of code to make them play or stop, which you can call from any other script in the scene. Have fun, please try to leave my code alone, and call me over if you have problems."

And this is exactly what I plan to do.

At a glance, the SoundSystem prefab I have built looks like this:



The script property called "Sound Clips Input" is a public array where you put all your sound clips.


You may notice that some of them have pretty weird convoluted names that will be impossible to remember when you want to call one. The solution is in that Text File property of the script.

The Text File links up the names you want to attach to the sound with their initial file names. At the moment, it has to be set up by hand, but for the scope of these projects, it won't take long enough to be an issue.

In my example project, the contents of the text file look a little something like this:

blues=13-Gritty Harp 1
rocket=18380__inferno__hvrl
bang=65731__Robinhood76__00759_explode_2_distant

On the left, the new name for the sound. On the right, separated by an '=' symbol, the original file name. This gets read and set up inside the SoundSystem script when the scene starts. Simple, no? (As testing progresses this week, I will put some thought into whether there are more efficient ways to manage this.)
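To give an idea of what that setup pass might look like, here is a rough C# sketch (not the exact script; the field names soundClipsInput and textFile stand in for the "Sound Clips Input" and "Text File" properties above, and the static instance and clips members are there so the static functions described below can reach them):

using UnityEngine;
using System.Collections.Generic;

public class SoundSystem : MonoBehaviour
{
    public AudioClip[] soundClipsInput;   // the "Sound Clips Input" array
    public TextAsset textFile;            // the "Text File" name mapping

    static SoundSystem instance;
    static Dictionary<string, AudioClip> clips = new Dictionary<string, AudioClip>();

    void Awake()
    {
        instance = this;

        // Each line of the text file is friendlyName=originalFileName.
        foreach (string line in textFile.text.Split('\n'))
        {
            string[] parts = line.Trim().Split('=');
            if (parts.Length != 2) continue;

            // Find the imported clip whose asset name matches the right-hand side
            // and file it under the friendly name from the left-hand side.
            foreach (AudioClip clip in soundClipsInput)
            {
                if (clip.name == parts[1])
                {
                    clips[parts[0]] = clip;
                    break;
                }
            }
        }
    }
}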

The Sound Object Input property contains a template "Sound Object", which is a GameObject containing a blank AudioSource. This is what I use to create and play what I call 'state-based sounds' - those that loop continuously when an object is in a certain state e.g. character footsteps. This template shouldn't really be editable - I suspect there is a way to link up script properties to assets in code rather than the inspector, which I will look into in a later iteration.

At the moment, the Sound System script contains three public static functions with which developers need to be concerned. They can be called from any script in the game, with no need for an explicit reference to the SoundSystem object within each script. Just use this syntax:

SoundSystem.functionName(parameterA, parameterB);

These functions are as follows:

playOnce(string Name, GameObject Caller)

This function plays the sound with name Name once at the position of the GameObject Caller, then cleans up the temporary AudioSource once the clip has finished. (For those interested, it uses Unity's built-in AudioSource.PlayClipAtPoint function).

The Name should be the name of the sound as specified in the text file, and the GameObject is the object at which you want the sound to occur. Usually this will be the object the script is attached to, which would be written as this.gameObject.
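As a concrete (and entirely hypothetical) example, here is a call from another script, plus a sketch of what playOnce can boil down to inside the SoundSystem class. The ClayPigeon class is just a made-up caller:

using UnityEngine;

// Some other script attached to an object in the scene:
public class ClayPigeon : MonoBehaviour
{
    void OnCollisionEnter(Collision collision)
    {
        SoundSystem.playOnce("bang", this.gameObject);
    }
}

// Inside the SoundSystem class itself, the method can be little more than:
public static void playOnce(string Name, GameObject Caller)
{
    // PlayClipAtPoint spawns a temporary AudioSource at the given position
    // and removes it once the clip has finished playing.
    AudioSource.PlayClipAtPoint(clips[Name], Caller.transform.position);
}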

startLooping(string Name, GameObject Caller)

This starts the sound with name Name looping continuously at the position of the GameObject Caller.

It does this by instantiating the Sound Object Template prefab at the position of the Caller object and changing the blank AudioSource clip to the clip called Name. This new object is then parented to the Caller object, so that if the caller object changes position, the sound effect will pan and fade accordingly.

If there is already a Sound Object attached to the Caller GameObject, the script will simply call Play() on the AudioSource component of this object, rather than creating a new object.
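Sketched out rather than quoted, and assuming the template is exposed as a soundObjectInput field on the SoundSystem and each spawned Sound Object gets named after its sound, that logic looks something like this:

public GameObject soundObjectInput;   // the "Sound Object Input" template

public static void startLooping(string Name, GameObject Caller)
{
    // If this caller already has a Sound Object for this sound, just restart it.
    Transform existing = Caller.transform.Find("SoundObject_" + Name);
    if (existing != null)
    {
        existing.GetComponent<AudioSource>().Play();
        return;
    }

    // Otherwise instantiate the template at the caller's position, give it the
    // requested clip, parent it to the caller so it pans and fades with it, and loop.
    GameObject soundObject = (GameObject)Instantiate(
        instance.soundObjectInput, Caller.transform.position, Quaternion.identity);
    soundObject.name = "SoundObject_" + Name;
    soundObject.transform.parent = Caller.transform;

    AudioSource source = soundObject.GetComponent<AudioSource>();
    source.clip = clips[Name];
    source.loop = true;
    source.Play();
}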

stopLooping(string Name, GameObject Caller)

This finds the looping sound with name Name at the GameObject Caller and stops it playing.

If this function is called from an object that has no such sound playing, the Sound System will print an error to the console.
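The matching sketch for stopLooping, under the same naming assumption:

public static void stopLooping(string Name, GameObject Caller)
{
    Transform soundObject = Caller.transform.Find("SoundObject_" + Name);
    if (soundObject == null)
    {
        // Nothing of that name is looping on this object: complain in the console.
        Debug.LogError("SoundSystem: no looping sound '" + Name + "' on " + Caller.name);
        return;
    }
    soundObject.GetComponent<AudioSource>().Stop();
}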

So, that's the basics of the Sound System.

I have also collated a bunch of placeholder sound effects to be used in most of the games, so I will be implementing and testing the system with the rest of the class this week. I have a list of things I would like to iterate on as I progress through the rest of semester, and I am sure my fellow developers will have useful things to add to it.

Wednesday, April 7, 2010

Make some noise about it

I have been working on a variety of things this week (Horse updates will hopefully occur in the next day or so), but most importantly I have been working on my solo deal: namely, making music.

So, for those of you with no attention spans who require images, this is what I have been working on:



Part of the music for Bullet, so far. This was based on my brief of 'epic strings' a la Hans Zimmer's work in The Dark Knight. It's scored for string orchestra plus timpani, bass and snare drum, and to be honest, it came out kinda Philip Glass (Philip Glass, Philip Glass, Philip Glass /music joke).

You can listen to a first musical 'sketch' of themes and ideas for Bullet here.

Bear in mind that the samples are the default output from Sibelius (the notation program, not the composer), and sound kind of dodgy. This will not be the final sound - it's just to give an idea of the harmonic structure and the kinds of motifs I want to layer in the music.

I'm fond of that triplet figure, even though it would be kind of horrible to play. Fortunately this won't be getting recorded, so I can fudge the samples as much as I like in Pro Tools. But in any case, Tyson and I have agreed that this first attempt is too dark for the game in question - Tyson pointed out that he felt like he should be killing kittens if this was in the background while he played. The next step, from where I'm standing, is to get rid of some of the heavier scoring in the bottom end and adjust the chord structure underscoring that top motif in the violins so that it's a little bit less edgy but still has a lot of depth.

Waiting to hear from the rest of the team before I make a move on this, though.

Next on the list was Islands:


Musical 'sketch' for Islands is here.

There's not as much stuff here, because it was really really easy for me to nail down the main motif. Going for that floaty Jo Hisaishi piano sound, a la Spirited Away. It accidentally ended up moving into 5/4 after those three introductory bars when I started noodling around with the chords on the piano, and it works really well because the extra beat gives it that slight lilt that emphasises the sense of air and flight.

I'll be eventually writing modulations, chord changes and variations on the theme (chords need work. I am not a pianist and I don't do interesting ones naturally), as well as instrumental parts to bring in and out, but this main motif will always be at the heart of it. This one I do want to record the parts for. On top of the piano, I want to write at least a sax part and a flute part, and maybe a guitar part. I am now wondering if I have any favours to call in from, say, horn players or oboists...

Anyway, I am still waiting to hear back from the team on this one.

My sound system is also making some progress, but I want to go back to focussing on implementing some more things in Horse, and writing music for it. I'd like to have a reasonable version of it ready for Supanova this weekend, so tomorrow is a Horse day.

Tuesday, March 30, 2010

Target Practice

Current vaguely scheduled list of targets/goals I'd like to hit this semester. Subject to frequent updates and adjustments. Delivery times are 'as at end of week x'.

Week 6
Primary:
Sound design breakdowns complete
Placeholder SFX implemented in projects
Secondary:
Story treatment roughs completed

Week 7
Primary:
Initial rough music concepts
Secondary:
Design documentation completed, if required by teams

Week 8
Primary:
Sound system initial version complete

Week 9
Primary:
Music concepts complete and locked in

Week 10
Primary:
Majority of SFX recorded/produced and in place

Week 11
Primary:
Sound system final version complete

Week 12
Primary:
Music composition and production complete

Going solo, update

Here is a clearer list of what, in an ideal world, I would like to achieve by the end of this semester (some of these, of course, may get extended to whole-year goals).

Mostly copied and pasted from my individual contract for this semester, but it helps me to put it up here.

Primary Goals

Sound Design

I will write sound design breakdowns for at least 3 – 4 projects (initially I am thinking for Horse, Bullet, Lantern and Islands), if not all of them. This includes lists of sound effects required, brief documents discussing the choices made in the sound design and the intended impact of sound and music on the player.

Sound Programming

I will build and maintain a sound system for use in Unity projects to streamline the inclusion of sound in the game. This is dependent on the nature of each game, whether the sound needs to be simulated in 3D space or not, and the needs of the separate teams.

I am still unclear as to the exact nature of this system. I am hoping that over the next week with some more feedback I will be able to define the problem better and start building a solution.

Music Composition/Effects Production

I will be composing most, if not all, of the music for Horse. This will include different tunes with various arrangements and styles for different stages of the game. While initially the music should have an 8-bit feel to match the retro look of the game, it should change frequently and drastically, in accordance with the subversive nature of the game design.

I have also been asked to compose the music for Bullet. The current music direction suggested is something along the lines of 'epic strings'.

I am keen to compose music for Islands, as this would allow me to explore a slightly different realm of composition – rather than music that is based specifically on the visuals or the theme of the game, Islands requires music that evokes a specific emotional reaction in the player.

Over the course of the projects, I will build up a library of sound effects based on the sound design for each game. Some of these may be generated using synths or tools such as SFXR, others will be Foley and general effects recorded at my home studio or sourced from various free sound effects libraries.

Secondary Goals

Story treatments

Subject to the desires of the core teams, I will be assisting with the development of narrative contexts for the Lantern and Islands projects. If this goes ahead, I will be writing story treatments for both games that form part of the overall design and inform the use of the gameplay mechanics involved.

Design Documentation

This is less of an individual focus and more of an element of the overall group design work. I am including it here because I have a specific interest in synthesising design ideas into a clear written format that effectively communicates the core concept of a project. I would like to use my writing abilities to help teams bring their ideas together into a cohesive whole that can be used for reference during development.

I will put my scheduling goals in a separate post, so that I don't get tl;dr problems when I need to refer back to them.

Sunday, March 21, 2010

Going solo

Hey there cats and kittens. This is your recently-appointed resident sound guy talking (okay, sound girl, technically, but that sounds naff and wishy-washy).

Finally getting around to blogging about my individual role for this semester's projects, so that all of you (my current developers-in-arms) know what's going on, and what you can ask me for if you need it.

The way I am currently thinking about it, the role of sound in these projects can be broken down into three areas:

1. Design

The decision-making process. This is really key to what impact, if any, sound has on the way the player plays the game, and on the player's impression of the game.

Off the top of my head, this is the sort of stuff I can work with people on, if you guys want my input:

  • deciding where and if music or sound effects should be used
  • picking and choosing the right kinds of sounds and the right places to use them
  • what will they signify to the player?
  • do they serve as feedback or stimulus?
  • will the player be able to read them correctly?

2. Sound programming

I would really like to build some kind of simple system that everyone can use to easily organise and trigger sounds and music for their games (probably in Unity, since the majority of projects will be happening in that).

I haven't had a chance to think too deeply about this yet, and I would welcome feedback on the kind of system people would like in order to help them streamline the audio pipeline. How would you guys like to be able to handle sound in your scripting?

3. Composition/Recording

This one is slightly trickier. I am a little rusty in this area, and I am not sure how quickly I will be able to produce music and/or effects. At this stage, I'm going to say tentatively that I will do the music for two projects this semester. These are actually already pretty much set - I'd like to compose for Horse, because there is the potential for a lot of fun and silly chiptuning there, and Tyson has already lined me up to work on their Bullet game, so unless the projects get especially shuffled around these two are pretty much locked in.

That said, if people want me to record Foley or create SFX (or even come over to my place and do their own while I record), I am happy for this to be arranged.

Do let me know if you want this facility available to you as an option, because I need an excuse to go get a new mic + monitoring headphones, and I will need time to sort that out.

So, in a nutshell, get in touch if you want soundie stuff, kids.