In today’s guest post, Yuri Lowenthal (who voiced the Prince in 2003’s Prince of Persia: The Sands of Time) talks about the special challenges of voice acting, as opposed to acting on camera.
When Yuri, Joanna Wasick and I came together in a sound studio for the first day of voice recording on POP:SOT, we didn’t have animations, animatics, or even concept art yet. While the POP team was bringing the world and characters of the game to life on screen, two actors first needed to make them real in their imaginations. The Prince and Farah began as voices in darkness.
I cherish voice recording as a special, thrilling, and terrifying moment in game production. Having experienced it from a writer-director’s point of view, I asked Yuri for an actor’s perspective on the process.
Yuri Lowenthal is an actor who lives and works in Los Angeles. You may have heard/seen him in Prince of Persia: The Sands of Time, Afro Samurai, Terminator: The Sarah Connor Chronicles, and Ben 10. He is married to actress Tara Platt and easily stalked at @YuriLowenthal. And if you’re nice he’ll tell you the exciting story about the time he met Jake Gyllenhaal.
People often ask me: “What’s harder? Voice acting or real acting?” I’ve heard it so many times that I hardly get offended anymore. Almost hardly. I mean, I get it; the person speaking is really trying to say: “What kind of acting is more difficult, the kind where we just end up hearing your voice, or the kind where we end up seeing your face?”
Well, let’s break it down:
For on-camera acting, I generally get the script in advance, time to talk with the director about the character and what his or her vision is for the project, maybe do a little research, put on a costume, work with some props, walk around the set, rehearse with other actors, and take time to break down the script so that I can bring you, the viewer, the best performance I am capable of.
For voice acting, I generally show up the morning of the recording, am handed a script, and after about 5 minutes (if I’m lucky) of discussion with the director (or sometimes writer) about the project, we get down to business so that I can bring you, the viewer/listener/gamer, the best performance I am capable of. Will my performance be judged less harshly because I didn’t have the niceties that an on-camera or theatrical situation can afford? Absolutely not.
Today’s guest post comes from KlickTock founder Matthew Hall, creator of Doodle Find and Little Things.
I can identify with Matt’s feeling that he came to the industry too late — that the “golden age of the bedroom coder” had passed him by. That’s exactly how I felt in 1982, when I’d had my Apple II for four years — since age 14 — and still hadn’t managed to get a game published. While other programmers produced hits like Space Eggs and Alien Rain, I could feel the window of opportunity closing, and kicked myself for having taken so long to get my act together.
As Matt and I can both attest, the brass ring comes around more than once.
I met Jordan at GDC earlier this year. I’d recently attended his postmortem of Prince of Persia and ran into him in the halls. We talked about developing games back then, and about our own game development histories. However, given that Jordan is quite famous and you’ve probably never heard of me before — what went wrong?
I am only a few years younger than Jordan. Just as he received his first computer, an Apple II, in 1978, I received my Commodore 64 in 1983. I programmed games throughout my childhood, but by the time I was able to produce a professional-quality game, the golden age of the bedroom coder was over. My 8-bit heroes had moved on to 16-bit and found themselves struggling. The industry had passed into the hands of those with big cheques and bigger teams.
Instead of producing a hit title in my bedroom — as I was always hoping to — I developed homebrew titles for the newly released Game Boy Advance. Nintendo would never allow garage developers like myself access to their development kits, so I used one of the many “flash-kit” solutions available on the black market. As an unlicensed developer I had to release all my titles for free; hardly untold riches! Regardless, I am proud of my titles even if only a handful of people were ever able to enjoy them.
My portfolio of titles and expertise in new hardware allowed me to get a professional game development job. But after 8 years of doing thankless work-for-hire, I eventually came to the conclusion that I had to leave my paid jobs and strike out on my own if I ever wanted to make a game I was truly proud of. I left my job just as the App Store was launching, though I had no idea it was going to change my life.
Little Things was released a year later. Though it was initially a failure on PC, it was featured by Apple as the iPad App of the Week and I’ve had similar chart-topping success with my other iOS games.
Finally the games industry had come full circle, once again empowering a lone developer with a stable platform, low cost of entry, excellent engines and tools available on the market, and a direct line to customers hungry for more games.
So I have a few pieces of advice for those with a passion for games and a notebook full of game ideas:
A number of readers have written to ask: “I want to make games for a living — how can I get started?”
Here’s advice from someone who crossed that bridge a lot more recently than I did: Adam “Atomic” Saltsman, creator of the phenomenally successful indie game Canabalt.
Today’s aspiring game designers can tap resources we couldn’t have dreamed of in 1980. But as Adam emphasizes, the bottom line is still the same: Don’t wait. Start making games right now.
Adam ‘Atomic’ Saltsman made Gravity Hook, Fathom, Flixel, and Canabalt. Adam also helped make Paper Moon, Cave Story Wii, FEZ, the Game City Idea Bucket, and the Flash Game Dojo. He lives in Austin, TX with his wife Bekah, his son Kingsley, and a couple of pug dogs, where he makes iOS games at Semi Secret Software.
When I graduated from high school in 2000, I knew exactly what I wanted to do with my life: make video games. There was only one serious video game curriculum at the time, offered by the DigiPen Institute, so competition for admission there was pretty intense. I didn’t even apply. The programs at Carnegie Mellon and MIT were still in their infancy. GAMBIT didn’t exist yet, but they had some other programs that looked interesting. I couldn’t afford the out-of-state tuition, and the enormous in-state college I decided to attend offered a single, solitary 4-credit course on the subject.
Times have changed; finding a satisfying career in video games isn’t the impossible joke it used to be. However, the chasm between “I want to make video games!” and actually making video games still intimidates a lot of people, regardless of age, gender or background. If you find yourself on the wrong side of this abyss, don’t panic! Crossing this gap is a lot less complicated than you might think.
Before we start figuring out how to make our dreams come true, though, let’s clarify what that dream is. Contrary to the funny comic above, what we’re talking about is making games, not playing games. Hopefully this doesn’t surprise you, but these are wholly different activities! Just because you enjoy playing games does not necessarily mean that you will love making them too. There’s only one way to find out, of course, but now is a good time to seriously consider whether you really love the act of creation. There is no position at any company in the world that involves just playing games for fun. Seriously, ask a video game tester how much “fun” it is to play the same level 6000 times…
But our game-making dream still needs a bit more clarity. After all, a significant portion of the modern video game industry revolves around pumping out rushed, under-budget game versions of cartoon franchises to whatever console happened to be left over during publisher negotiations (this is not a slam on folks that do that work for a living; their dedication and resourcefulness impresses the heck out of me). So our dream is not just to make any old games, but to make satisfying, interesting games that reflect our passions and interests, whatever those may be.
So how do we do that? How do we escape from our IT/retail/food-service gig and start making games for a living?
I finally read Apple co-founder Steve Wozniak’s great memoir this week — prompted by the tsunami of media commentary on the resignation of Steve Jobs (you know, the other guy). It got me thinking about what an incredible impact stuff made or sold by those two Steves has had on my life over the past three decades.
I was a sophomore in high school when I bought my first Apple II. It cost $1200 at the Computerland of Fairfield, Connecticut — my life savings, including all my loot from years of drawing caricatures at community fairs, plus a loan from my kid sister.
I remember opening the box, lifting the computer out of those custom-molded foam packing pieces. The tactile thrill of owning an Apple began before I’d even plugged the thing in. I knew it was going to change my life.
I hooked it up to an old TV and a cassette recorder, and I was up and running.
Weekends and after school (and sometimes instead of school), I progressed from typing in BASIC game program listings from the red book that had come with the Apple (Breakout was the best), to inventing my own games — first in BASIC, then in 6502 machine code, using the built-in mini-assembler. I pored over the red book, trying to understand its secrets.
As soon as I could afford it, I increased the Apple’s 16K of RAM by adding another row of chips, and then another. Each enhancement unlocked new capabilities: hi-res graphics, then two-page hi-res. Newer, more sophisticated games like Apple Invader (a pixel-perfect copy of the coin-op Space Invaders, programmed by the mysterious M. Hata) pushed the machine’s limits beyond what I’d imagined possible. I realized the games I’d programmed so far hadn’t scratched the surface of what it could do.
I brought my Apple to college. Tricked out with a dot-matrix printer, 5 1/4″ floppy disk drive, lower-case adapter chip, and new word-processing software that could hold up to four pages in memory, it replaced a portable Smith-Corona typewriter as my go-to device for writing papers. I was the only kid in my dorm who had such an awesome system. I used it to earn extra cash typing other people’s papers for a buck a page.
Between classes (and instead of them), I used it to make a game called Karateka.
The Karateka royalties bought me a brand-new 512K Macintosh computer, through a special student-discount arrangement Apple had with Yale.
Macs started popping up all around campus that year. It was still unusual for a student to actually own one — the only other guy I knew who had one was David Pogue, down the hall — but anyone could use the ones in the computer rooms, and a lot of people did.
The Mac had a tiny but amazingly high-resolution screen, with a mouse-driven graphical interface that gave it a totally different vibe from other computers. It was a device that even non-techies felt comfortable using. And it could hold 100 pages of text in memory. The Mac changed playing games and typing papers on computers from a fringe activity into part of mainstream college life.
I loved my Mac. It was a shiny new toy — good to write papers on, fun to show off to friends — but I didn’t consider it a machine for serious programming. I wasn’t enough of an engineer to pop the hood and figure out how it worked and what all the chips did, the way I’d done with the Apple II. It was too sophisticated.
Besides, the installed user base of Macs in 1985 was minuscule compared to the Apple II’s. As a game programmer, it didn’t make business sense for me to switch.
So my new Mac took its place alongside my main working system — which I’d by then upgraded to a newer Apple IIe with 64K of RAM, two disk drives, color monitor and joystick. That was the computer I used to program Prince of Persia.
I hadn’t anticipated that, due to my combination of obsessive perfectionism and occasionally dilatory work habits, Prince of Persia would take me four years to finish. By the time I was done, the Apple II was obsolete.
Ironically, it was the Mac version that saved my new game from oblivion. While the Apple market was dying, the rise of desktop publishing had created a new market of Mac owners hungry for games to play on their high-resolution color screens. They embraced Prince of Persia and made it a hit.
Today, like almost everyone I know, my daily life is inextricably bound up with Apple products. I’m typing this in a café on a MacBook Air, with an iPad and iPhone in my shoulder bag, and more Macs and iProducts on view at the tables around me than I can count.
Devices that in ten years will seem as quaint as my 1978 Apple II does now.
But oh, man, it was a thing of beauty.
Survived another Electronic Entertainment Expo, and I even got a few minutes to sketch between meetings.
The LA Convention Center felt much quieter than in previous years. Restaurants had plenty of tables, and on the show floor you could actually hear yourself talk.
I’ll be speaking at the Nordic Game 2011 conference next week in Malmö, Sweden. The theme of this year’s conference is “Creativity and Entrepreneurship” and they’ve asked me to give a keynote on the subject of “Transmedia.” (No, I don’t know what it means, either — I’m putting my presentation together today, so if you have any ideas, shoot them over quick!)
Hope to see some of you there. And Mom, if you’re reading this, Happy Mother’s Day!