


GDC 2003

Automated Testing of Massively Multiplayer Games: Lessons Learned from The Sims Online

Price: $5.95
Stock: Unlimited
Weight: 7 lb, 11 oz
SKU: GDC-03-124

Description

Session 839: Programming, Lecture
Speaker: Larry Mellon, EA/Maxis
The development and operation of massively multiplayer Persistent State Worlds has proven to be difficult. The distributed nature and scale of the target system increase the complexity of debugging the implementation, while the size, scope, and constantly evolving nature of the feature set incur a high regression cost. The Sims Online (TSO) encountered stability and scalability issues early in the development cycle.

To address these major cost drivers, TSO shifted to a development approach revolving around a set of automated testing tools. Portions of the TSO game client are used to assemble a test client that interacts with servers in the normal manner. A scripting system is attached to this test client at the same entry points as the GUI, using a Presentation Layer to provide a semantic abstraction of the UI's functionality. Scripts may thus mimic a series of user actions. Remote process control and synchronization primitives add single-script control over multiple clients. A single, data-driven test client thus supports developer pre-checkin testing, QA feature regression, and load testing.

This paper addresses the design, implementation, and fielding of such tools. The effects of introducing automated tests into the day-to-day development and debugging of a massively multiplayer game are discussed, while lessons learned illustrate the issues in fielding such a tool for a large, constantly evolving distributed system.
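The Presentation Layer idea described in the abstract can be sketched roughly as follows. All class and method names here are illustrative assumptions, not TSO's actual API: the point is only that a script calls the same client entry points the GUI would, through semantic actions rather than raw clicks.

```python
class GameClient:
    """Stand-in for the real client core that the GUI normally drives."""
    def __init__(self):
        self.events = []

    def dispatch(self, command, **args):
        # In the real client this would send a request to the server;
        # here we just record it so a test can inspect what happened.
        self.events.append((command, args))
        return True


class PresentationLayer:
    """Semantic abstraction over the client's UI entry points."""
    def __init__(self, client):
        self.client = client

    def enter_lot(self, lot_id):
        return self.client.dispatch("enter_lot", lot_id=lot_id)

    def buy_object(self, object_name):
        return self.client.dispatch("buy", name=object_name)

    def chat(self, text):
        return self.client.dispatch("chat", text=text)


def regression_script(ui):
    """A script mimics a series of user actions via semantic calls."""
    assert ui.enter_lot(42)
    assert ui.buy_object("espresso_machine")
    assert ui.chat("hello world")


client = GameClient()
regression_script(PresentationLayer(client))
print(len(client.events))  # prints 3: the replayed user actions
```

Because the script speaks in terms of intent ("buy an object") rather than widget coordinates, the same script survives GUI changes, which is what keeps regression cost low.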

The audience should leave with the ability to rapidly convert any Internet game client, or any single-player game, into a test client with strong regression and debugging capabilities. Strategies for increasing the testability of a system via simple abstractions are presented, resulting in increased stability during fielding and lower development costs. Through both of the above, the audience encounters specific examples of problems encountered and lessons learned while implementing and fielding TSO's distributed-system toolkit.
