There has been much debate in gaming circles about a possible second crash in the video game industry. Some hold the view that this is just alarmism, whilst others affirm that a crash will indeed occur. In order to understand the possibility of a crash, we must first look back at the turbulent days of the early '80s, when the first crash happened.
The inception of video games is heavily disputed, but it is common knowledge that they originated in university labs, where computer engineering students used spare machine hours to program games. Personal computers are common household goods today, but in the past, computers were extremely expensive. Costing around $120,000 each and taking up entire rooms, they could not be afforded by individuals.
The first known computer game with a screen display is Spacewar!, programmed in 1962 by Steve Russell. Spacewar! was a two-player game in which each player controlled a spacecraft and had to shoot the other whilst avoiding falling into the sun. The game was revolutionary for its time, and it spawned several copies at different universities. Unfortunately, due to the high cost of computers, the game could never get off campus.
Early video games like Spacewar! managed to captivate a whole new generation of software engineers who would go on to create the industry we know today. One of the students captivated by one of Spacewar!'s many offshoots was Nolan Bushnell.
Bushnell saw the potential of video games while working part time at an amusement park, so after graduating, he decided he would make video games. He approached a colleague, Ted Dabney, and together they formed Syzygy in 1969. The first game they made was Computer Space, an arcade version of Spacewar!. Computer Space made $3M but was still considered a flop; its poorly designed controls made it difficult to play, and the game never caught on.
Undeterred, Bushnell and Dabney continued making arcade games, and in 1972 they registered their company as Atari, Inc., since the name Syzygy was already taken. Later that year, Bushnell hired Allan Alcorn to make an arcade version of the Magnavox Odyssey tennis game, which Bushnell had seen at a product demonstration. The result was called Pong.
Unlike Computer Space, Pong was simple to play and hard to master, which made it very entertaining. It became a massive success, and Atari went on to make a fortune. In 1975, Atari began developing home consoles featuring its most successful arcade games. In 1976, Bushnell sold Atari to Warner Entertainment for $32M (about $133M today, adjusted for inflation).
The year was 1983. Bushnell had sold Atari to Warner seven years earlier; now Atari was secretly dumping millions of game cartridges in a New Mexico desert. Video game studios were closing one after another, and the entire video game industry seemed to be in peril. In a mere six months, the fastest growing industry in America's history had tanked from a value of $3.2B to $100M, a 97% decrease. What went wrong?
One thing to remember about the video game crash of 1983 is that it only affected the American and European markets, and whilst the console market was obliterated, PC gaming remained stable; in fact, the rise of home computers is often cited as one of the causes of the console collapse.
Too Many Consoles
When the crash came about, the industry had only been in existence for 12 years, and it had been dominated predominantly by the arcades. In 1975, Atari released the home Pong console, which for the first time could be played in American homes. What ensued was a barrage of new consoles year after year. Video games were very profitable, and several toy and tech companies, attracted by the high margins, began developing "me too" consoles. Some companies released several consoles in the same year, as games back then were built into the machine. In 1977, Atari released the Atari 2600, which introduced the ROM cartridge, allowing games to be sold separately from the console for the first time.
Atari had risen to prominence thanks to their popular arcade games. By 1980, they were one of the fastest growing companies in the US. Unfortunately, Atari’s creative culture had changed drastically since it was sold to Warner, and their success led to internal friction between the programmers and the executives.
Atari didn't want developers putting their names on the games they designed, and the pay gap between executives and developers widened. As a result, many of Atari's top programmers left the company and started their own businesses when the opportunity arose. In an effort to retain its remaining talent, Atari paid increasingly high bonuses to those who stayed, which proved to be a substantial burden once the market experienced a downturn.
Too Many (Bad) Games
The Atari 2600 revolutionised the console market with its cartridge system. Unlike modern consoles, the Atari 2600 did not feature a lockout chip, and the employees who left Atari exploited this to their advantage by developing their own games for the 2600 and other consoles.
Amateur programming enthusiasts quickly realised that they could make their own games and generate vast profits. Companies with no involvement in video game development began hiring people off the street in order to partake in the bonanza.
Meanwhile, executives at Atari saw their profits dwindle as more and more newcomers entered the market. In desperation, Atari acquired the rights to Pac-Man and promised developer Tod Frye a royalty on every game sold. The game was rushed, and Atari lost a fortune. Atari executives then bought the rights to Spielberg's E.T. and frantically began trying to make a game in time for Christmas. Developers were given only six weeks. It ended in disaster.
Whilst games like Atari's E.T. were poorly made, there were far worse games, and in far greater numbers. Companies with no knowledge of game development, but who still wanted to make a quick buck, flooded the market with cheap, poor-quality games. It's not that there weren't any good games at the time; it's that the bad games retailed for $1, and there were far too many of them.
Why buy a $19.95 game when you can buy 19 for the same price?
Unfortunately, consumers got burnt by all the subpar games and eventually stopped purchasing games altogether.
When the crash came, the video game industry was still in its infancy. The smaller game-developing "companies" were made up of programming enthusiasts who did not know how to run a business and were prone to making poor decisions.
The smaller publishers needed retailers to stock their games, which left them with little bargaining leverage. Retailers demanded that studios buy back all unsold games, passing the losses of unsold stock onto the developers, who did not foresee a shrinking market. Retailers, carried away by visions of big profits, placed massive orders that exceeded demand, and developers went bankrupt over the unsold games they had been ordered to manufacture.
There came a point where retailers had nowhere to send the unsold games, as the developers had gone under. Wall Street panicked, developers' stock prices tumbled, and millions of dollars were wiped out in record time. Gaming was written off as a fad.
House Of Cards
Lured by the high profits generated by Atari, several tech companies rushed to the market and put out their own consoles in order to compete.
Disenfranchised developers left Atari to form their own development studios, leaving the company with a deficit in human capital. These new studios began releasing their own games for the Atari 2600 and other platforms, cutting into Atari's margins. Atari was left with no option but to hire mediocre programmers at high rates.
Word spread that anyone could make games for consoles. Amateur programmers and non-tech companies alike rushed in to develop their own games. The amateur programmers signed disadvantageous contracts with retailers that put them at financial risk should game sales falter, while the non-tech companies produced cheap games that were not fun to play and were often nothing more than advertisements.
Once the market was flooded, a series of poor decisions and rushed production by Atari caused severe backlash.
In the end, consumers lost all trust in video games and abandoned the medium. The video game industry looked like it would never recover. Fortunately, a Japanese card company decided that video games deserved another chance.
That company was Nintendo, a playing-card manufacturer which concluded that the North American video game crash, known in Japan as the "Atari Shock", had occurred due to poor decisions by industry executives, not because video games were a dying fad.
Nintendo decided to enforce quality control by creating a lockout chip, a technology the Atari 2600 had lacked. By granting licences to third-party developers for a fee, Nintendo could both stay profitable and control which games appeared on its console platform.
If a game was considered poor in quality, Nintendo wouldn't license it. If a third-party game met Nintendo's criteria, it would be allowed on the platform and given a certificate known as the Nintendo Seal of Quality. Gamers in the '80s and '90s rewarded Nintendo with their loyalty and money. The NES went on to build an empire for Nintendo and produce some of the greatest games of all time. To date, Nintendo has kept this policy in place, and the Seal of Quality can still be seen on all games made for the Wii and the 3DS.
Is today’s gaming industry over-valued? Is there a surplus of unoriginal, low-quality games that consumers will eventually refuse to buy? Will a new Nintendo arise out of the ashes? I will continue exploring this in the next “State of the Industry” report.