Grant Skinner

The "g" in gskinner. Also the "skinner".

@gskinner

MLB.com OnBase Project in Times Square

We recently had the pleasure of building the MLB.com OnBase AIR application, which lets you keep track of everything that’s going on with your favourite MLB teams and players via Twitter, official notifications, and game updates. Working with MLB on the project was very cool by itself, but as a cherry on top it is now being advertised in Times Square. It’s pretty sweet to see your work 50 feet tall in Times Square – it’s a little like being a super nerdy and completely unknown Broadway star.

I’m very proud of all the awesome work my team did on this one. Consolidating all of the data feeds necessary to power this app in an effective and efficient manner was quite a challenge, and they did a great job of making it happen.

You can get more info on MLB.com OnBase at mlb.com/onbase.

Here’s a picture of the digital billboard in Times Square:

cacheAsBitmapOptions

I just filed a feature request for Flash Player, asking for a cacheAsBitmapOptions property on display objects. The intent behind this is to make the current cacheAsBitmap functionality more useful. Currently, when you use cacheAsBitmap, the cache is redrawn any time you modify the transform of the display object in any way other than translation (moving it in x/y). For example, rotating, scaling, or changing the alpha of your display object forces the cache to redraw.
It’s often useful to maintain a cache through other transformations, such as rotation, changes to alpha, and scaling. I frequently implement this functionality manually by drawing the display object to a BitmapData, and swapping the Bitmap for the original Sprite. This works, but it’s very clumsy, and requires a lot of manual management.
In conjunction with this, it would be useful to be able to specify the resolution the bitmap cache is created at, as a scale multiplier of the display object on stage.
The cacheAsBitmapOptions feature I am proposing would let you programmatically specify what transformations would result in a cache redraw. For example:

import flash.display.CacheAsBitmapOptions; // name based on NetStreamPlayOptions
var myOpts:CacheAsBitmapOptions = new CacheAsBitmapOptions();
myOpts.updateOnRotation = false;
myOpts.updateOnAlpha = false;
myOpts.updateOnScale = false;
myOpts.cacheScale = 2;
characterSprite.cacheAsBitmapOptions = myOpts; // passing null would revert to defaults.
otherSprite.cacheAsBitmapOptions = new CacheAsBitmapOptions(2, false, false, false);

If you think this functionality would be useful for your projects, feel free to vote it up. The bug is filed in the Flash Player bugbase as bug #FP-2524.
On Monday I will post a blog entry walking through how to implement this type of functionality manually, with a handy helper class.

Flash / ActionScript Developer Qualifications

Steve Weiss from O’Reilly emailed me to ask what skills Flash and ActionScript Developers require, beyond the obvious, to progress with their work and careers. I tossed together a quick list, which wound up being longer than I expected, so I thought I’d share my response:


While I don’t think it’s an industry norm yet, I consider ActionScript Developer and Flash Developer to be synonymous. Flex Developer is not – it implies a knowledge of MXML that is not necessary to be a pure AS or Flash Developer.

Skills I would expect any experienced Flash Developer to have include:

  • ActionScript development (obviously)

  • OOP experience

  • Some experience with architecture / design patterns

  • Code standards

  • Data services integration: XML, JSON, SOAP, etc.

  • Problem solving

  • Debugging

  • Optimization, both code and graphics

  • Quality testing

  • Reasonable understanding of UX and interaction design

  • Basic graphic design and motion graphics skills (enough to understand and implement designer concepts)

  • Graphics import

  • Basic skills with Photoshop, Illustrator, and Fireworks (for tweaks / exporting art)

  • Basic understanding of video / audio compression

  • Programmatic motion

  • Writing (for team / client communication, documentation, comments, etc)

  • Verbal communication and interpersonal skills for team / client interaction

  • Experience working with Flash and the timeline

  • Experience with an external code editor (ex. FlexBuilder, FDT, FlashDevelop)

  • High level understanding of Flex, FMS, Remoting, FlashLite, AIR and other core Flash platform products / libraries

  • High level understanding of server development and databases (not necessarily the ability to implement anything, but a small amount of experience / understanding of the models)

  • Basic understanding of HTML, JS, CSS

  • Integration with HTML, JS

  • Community awareness (online resources, frameworks, etc)

  • Basic math – understanding and combining +, -, /, *, %, exponents and basic trigonometry (sin, cos, atan2, etc)
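As a concrete example of that last bullet, here is a quick sketch (in plain JavaScript with illustrative names; the ActionScript equivalent is nearly identical) of the distance and angle math that comes up constantly in Flash work:

```javascript
// Basic trig a Flash developer uses constantly: distance between two points,
// the angle from one point to another, and stepping along that angle.
function distance(x1, y1, x2, y2) {
  var dx = x2 - x1;
  var dy = y2 - y1;
  return Math.sqrt(dx * dx + dy * dy);
}

// Angle (in radians) from point 1 to point 2. atan2 handles all quadrants,
// and note the argument order: y first, then x.
function angleTo(x1, y1, x2, y2) {
  return Math.atan2(y2 - y1, x2 - x1);
}

// Move a point `speed` pixels along an angle.
function step(x, y, angle, speed) {
  return { x: x + Math.cos(angle) * speed, y: y + Math.sin(angle) * speed };
}
```

This is the math behind "point at the mouse", programmatic motion, and most of the collision work discussed elsewhere on this blog.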


This is not necessarily a comprehensive list, and it definitely has a lot of overlap with my “Things Every Flash Developer Should Know” talk, but I thought it would be worth sharing so that junior developers have a rough guide of valuable skills.

Did I miss something important? Disagree with a point? Leave a comment.

Evolution of an Experiment: Circle Collision part 3

In building the circular collision music visualizer which I blogged about yesterday, I became mildly obsessed with the lighting effects on the circles, which led to another short set of experiments.

My first step was to simplify the music visualizer, so that I could focus just on the lighting. I wanted the lighting to be dynamic though, so I kept the light intensity keyed to music. I played around with a lot of different ideas for lighting effects, finally settling on a combination of a bevel filter, an inner drop shadow, and a stretched / rotated / blurred shadow sprite – all of which are affected differently by the intensity and distance to the light. I also added a radial gradient behind the light to make it more dramatic. The result was LightTest1.

I also built a slight variation, LightTest1a, that modifies the intensity of the light based on its proximity to the cursor, so the user can control the brightness.
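The proximity-based intensity in LightTest1a boils down to a simple distance falloff. A minimal sketch of the idea (plain JavaScript, hypothetical names and constants; not the actual LightTest code):

```javascript
// Light intensity falls off linearly with the cursor's distance from the
// light, clamped to the 0-1 range. `maxDist` is the radius at which the
// light fully fades out.
function lightIntensity(lightX, lightY, cursorX, cursorY, maxDist) {
  var dx = cursorX - lightX;
  var dy = cursorY - lightY;
  var dist = Math.sqrt(dx * dx + dy * dy);
  return Math.max(0, Math.min(1, 1 - dist / maxDist));
}
```

The resulting 0-1 value can then scale whatever drives the effect: filter strength, glow alpha, shadow distance, and so on.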

Finally (thus far), I made the circles move based on the volume of the music, which created a somewhat hectic but mildly interesting visualizer in LightTest2.

I’m not quite sure where I’m going to go with this from here. I’ll release some of the source code next week. I’m also planning to do a similar series of articles covering my recent sphere experiments at some point.

Evolution of an Experiment: Circle Collision part 2

Continuing from yesterday’s article showing the beginning of my circular collision experiments, this post will show how the simple test cases evolved into a music visualizer.

Yesterday on Twitter, SeyelentEco asked me why I built a circular collision engine instead of using an existing physics library. The main reasons were size, speed, and simply because I wanted to learn a bit about the math involved. For example, all of these experiments are under 5kb, and the collision logic can handle a couple hundred circles without too much trouble.

As part of the initial commercial project that birthed these experiments, I built out a version in which circles would scale up quickly when the user rolled over them. Unfortunately, I don’t seem to have this version any more, but I thought it was pretty cool to roll over a circle and watch the surrounding circles get propelled outwards. Rolling over/out of a circle rapidly sparked the idea of having two “speaker” circles that would expand based on the volume of a playing track.

To build this, I realized I would need support for anchored circles (ones that didn’t move). This led to CircleTest4, which implemented this feature. Click and drag to create circles; move the cursor before you release to give them velocity. It’s mildly entertaining to see how many bounces between the circles you can get.
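The drag-to-launch interaction and anchored circles might look something like this sketch (plain JavaScript, hypothetical names; not the actual CircleTest4 code):

```javascript
// A circle is created on mouse-down; anchored circles never move.
function makeCircle(x, y, anchored) {
  return { x: x, y: y, vx: 0, vy: 0, anchored: !!anchored };
}

// Called on mouse-up: the press-to-release drag vector becomes the circle's
// velocity, scaled down so a long drag doesn't launch it off screen.
function launch(circle, pressX, pressY, releaseX, releaseY, scale) {
  circle.vx = (releaseX - pressX) * scale;
  circle.vy = (releaseY - pressY) * scale;
}

// Per-frame update: anchored circles skip the position update entirely.
function update(circle) {
  if (circle.anchored) { return; }
  circle.x += circle.vx;
  circle.y += circle.vy;
}
```

Treating "anchored" as a flag checked in the integration step (rather than a separate class) keeps the collision code identical for both kinds of circles.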

Next I plugged in the music, wired each channel’s volume to a “driver”, and started playing with the aesthetics, resulting in CircleTest5.
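Driving a circle’s size directly from a volume value tends to stutter, so some easing toward a target is usually needed. A hedged sketch of the idea (plain JavaScript; `easeDriver` and its parameters are illustrative, not the actual CircleTest5 code):

```javascript
// Each frame, ease the driver's radius toward a target derived from the
// channel's volume (0-1), so the "speaker" pulses smoothly instead of
// jumping. `ease` is the fraction of the remaining distance covered per
// frame (e.g. 0.5 = close half the gap every frame).
function easeDriver(driver, volume, baseRadius, range, ease) {
  var target = baseRadius + volume * range;
  driver.radius += (target - driver.radius) * ease;
  return driver.radius;
}
```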

From there, it largely became an exercise in design and motion graphics. I played with the colors, added trails using the drawing API and BitmapData, and continued tweaking the collision logic to be more reliable in high-speed collisions. The result was CircleTest6.

I ran through a few more design iterations (which I won’t link because they are pretty minor): CircleTest7 had the drivers orbit each other at close proximity and added the subtle bevel effect to make them look like metaballs, CircleTest8 was a variation playing with the circles moving farther apart based on volume (which I didn’t like), and CircleTest9 played with adding color.

Ultimately, all this tweaking resulted in the current final product, CircleTest10, which combined the metaballs, color, and some basic lighting effects. If you click and hold, you’ll note some vestigial functionality from earlier experiments: the ability to create new red circles.

In part 3 I will show how CircleTest10 evolved into a short series of experiments on lighting effects. I will also release at least some of this code once I have a chance to clean it up a tad.

Evolution of an Experiment: Circle Collision part 1

I thought it might be interesting to share the progression of a couple of my more recent experiments, so people could see how a simple initial concept slowly evolves into something more complex and polished looking.

I’m going to start out with my “circle collision” experiments. These were actually sparked by a client project, which has itself significantly evolved into something much different. However, suffice it to say that the original commercial project required circular collision logic and gravity.

I started with a very simple test case demonstrating gravity and circular collision. In order to get the specific result I wanted I used the square root of distance for gravity instead of the square. You can check out CircleTest1 here. Click and hold to create new circles. They will simply bunch together in the center.
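The square-root gravity tweak is easy to sketch (plain JavaScript, hypothetical names and constants; not the actual CircleTest1 code):

```javascript
// Attract a circle toward a center point with force proportional to the
// SQUARE ROOT of distance, rather than the usual inverse square. Far-away
// circles are pulled harder, so everything bunches up in the middle.
function applyGravity(circle, centerX, centerY, G) {
  var dx = centerX - circle.x;
  var dy = centerY - circle.y;
  var dist = Math.sqrt(dx * dx + dy * dy);
  if (dist === 0) { return; }
  var force = G * Math.sqrt(dist);   // sqrt falloff, not G / (dist * dist)
  circle.vx += (dx / dist) * force;  // accelerate along the unit vector
  circle.vy += (dy / dist) * force;  // toward the center
}
```

With true inverse-square gravity, circles near the center dominate and distant ones barely move; the sqrt curve inverts that, which is what produces the tight central cluster.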

Still focused on the client project, I worked to refine the collision logic to make it more precise, and started playing with ways to bring the circles on screen. CircleTest2 demonstrates one of these ideas: launching the circles from the edges of the screen at a roughly right angle to the edge.
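Launching from the edges at a roughly right angle could be sketched like this (plain JavaScript; the edge encoding, names, and jitter parameter are my own, not the actual CircleTest2 code):

```javascript
// Launch a circle from a screen edge, heading inward roughly perpendicular
// to that edge. `edge` is 0=top, 1=right, 2=bottom, 3=left; `t` (0-1) picks
// the position along the edge; `jitter` randomly skews the angle so the
// launches aren't perfectly uniform. Screen y increases downward.
function launchFromEdge(edge, t, w, h, speed, jitter) {
  var inwardAngles = [Math.PI / 2, Math.PI, -Math.PI / 2, 0];
  var angle = inwardAngles[edge] + (Math.random() * 2 - 1) * jitter;
  var pos = [
    { x: t * w, y: 0 },  // top
    { x: w, y: t * h },  // right
    { x: t * w, y: h },  // bottom
    { x: 0, y: t * h }   // left
  ][edge];
  return {
    x: pos.x,
    y: pos.y,
    vx: Math.cos(angle) * speed,
    vy: Math.sin(angle) * speed
  };
}
```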

At this point, I started to think beyond the client’s requirements, and decided to add mass-based collision physics (which wasn’t needed in the commercial project). Initially I just wrote the collision and physics from scratch, but in later experiments tweaked the logic with information I found online. To test them, I enhanced the UI slightly so that I could manually create circles with different sizes/masses and initial velocities. You can check out CircleTest3 here. Click and hold to create circles. Move your mouse before you release to change the initial velocity of the circle.
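For reference, the standard mass-based approach splits each velocity into components along the collision normal and tangent, and applies the 1D elastic collision formulas along the normal only. A sketch (plain JavaScript, hypothetical names; not necessarily the exact logic used in CircleTest3):

```javascript
// Resolve an elastic collision between two circles with mass. Only the
// velocity components along the line between centers (the collision normal)
// exchange momentum; the tangential components are untouched.
function resolveCollision(a, b) {
  var dx = b.x - a.x, dy = b.y - a.y;
  var dist = Math.sqrt(dx * dx + dy * dy);
  if (dist === 0) { return; }
  var nx = dx / dist, ny = dy / dist;  // unit collision normal

  // Each circle's velocity component along the normal.
  var va = a.vx * nx + a.vy * ny;
  var vb = b.vx * nx + b.vy * ny;

  // 1D elastic collision formulas (conserve momentum and kinetic energy).
  var m = a.mass + b.mass;
  var vaAfter = (va * (a.mass - b.mass) + 2 * b.mass * vb) / m;
  var vbAfter = (vb * (b.mass - a.mass) + 2 * a.mass * va) / m;

  // Apply the change along the normal only.
  a.vx += (vaAfter - va) * nx; a.vy += (vaAfter - va) * ny;
  b.vx += (vbAfter - vb) * nx; b.vy += (vbAfter - vb) * ny;
}
```

A quick sanity check for any implementation like this: two equal-mass circles in a head-on collision should simply swap velocities.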

The next installment of this 3 (?) part series will look at the process of turning these initial experiments into a music visualizer (note: music will take a while to load).

If people want, I will likely release at least some of this code. It’s pretty hacky, but maybe someone will find it useful. Also, if people think this series is interesting, I will also do one on the spheres experiments I’ve been playing with.

Fanbase Goes Multi-Screen: Desktop, CD, iPhone

Atlantic Records recently announced the launch of two projects that we’ve been working on with them: Fanbase for iPhone, and the first Fanbase Connected Album.

We’ve been working with Atlantic for a little over a year now, originally on Fanbase Desktop. Fanbase is an AIR application, developed with Flex and Flash, that connects music fans with each other and their favourite artists through chat, forums, news, photos, videos, music, events, and more. It’s been very successful, with 17 artists using it to date to communicate with thousands of fans, and many more coming. In order to facilitate this growth, we also built a custom AIR application that allows Atlantic’s designers to visually configure and skin the application for different artists. This generates an artist package, which can be installed into the shared Fanbase application via a custom AIR install badge.

Since its launch, Fanbase Desktop has won a number of awards, including the “NARM Outstanding Achievement Business Innovation Award”. Adobe did a customer success story on it here.

Until recently, Fanbase Desktop was only available to download online. Now, a special version that we built with Atlantic is also available on the new Rob Thomas album (CD), with bonus content and features. Pretty cool (and a little nerve wracking) to see your work on physical media that’s being mass-distributed. It’s likely there will be other CD versions for other artists in the future.

Perhaps most exciting for me is the new iPhone version of Fanbase (available in the US iTunes store). This is our first publicly available iPhone application, and I’m very proud of the end result. We have been ramping up our iPhone capability internally for a while, but I’ve been waiting for a really great first project, and this was definitely it. I’m also really happy with the story it lets us tell, about taking a single property to multiple unrelated devices. I’m still waiting for some of the awesome new features of FlashLite and Flash on TV to become a reality so that it’s even easier to refactor and redeploy an experience to multiple devices.

A big thanks to Eric at Atlantic Records for trusting this project to a previously untested iPhone development group! Also a huge thanks to my amazing team for pulling the extra hours to deliver such a solid end product on an insane timeline.

At the risk of ending on a sales pitch, gskinner.com is now officially accepting select iPhone development projects. We are particularly interested in innovative projects that also involve desktop or web integration.

Source Code: Spectrum Ring Music Visualizer

I had someone request the source code for my spectrum ring music visualizer, and I figured I may as well just release it here for anyone who wants it.

Update: Note that it will sometimes throw runtime errors in FP10. I haven’t had a chance to look into it, but I’m quite certain it never threw those errors in FP9. I’ll try to fix it when I have a few minutes.

If I remember correctly, it was my first attempt at playing with computeSpectrum and ByteArrays, so the code is likely a hacky mess, but hopefully it’s useful for someone. At the least, it might be fun to take a look at how the visual effects were achieved, and play with the numbers.
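The core mapping in a spectrum ring is taking a flat array of spectrum samples (like the 256-per-channel values computeSpectrum fills a ByteArray with, each roughly in the -1 to 1 range) and turning them into segment radii around a circle. A hedged sketch of that idea (plain JavaScript, hypothetical names; not the actual source):

```javascript
// Map spectrum samples onto N segments arranged in a ring: each segment
// averages the absolute amplitude of its slice of samples, and that average
// pushes the segment outward from a base radius.
function spectrumToRing(samples, segments, baseRadius, range) {
  var perSegment = Math.floor(samples.length / segments);
  var ring = [];
  for (var i = 0; i < segments; i++) {
    var sum = 0;
    for (var j = 0; j < perSegment; j++) {
      sum += Math.abs(samples[i * perSegment + j]);
    }
    var avg = sum / perSegment;
    ring.push({
      angle: (i / segments) * Math.PI * 2,  // where the segment sits
      radius: baseRadius + avg * range      // how far it extends
    });
  }
  return ring;
}
```

From there it is just a matter of drawing each segment at its angle/radius, which is where all the visual tweaking lives.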

You can download the source here. Just drop an MP3 named “music.mp3” in the same directory and it should load and play it. All the code is in the AS file (the FLA is empty), so it should work as an ActionScript project in FlexBuilder.