Tuesday, October 1, 2013

GTA Online launches, but server issues are likely

8:11 AM

GTA Online launches on Tuesday – but will the servers be able to cope?
Online video game launches tend to be precarious things. Publishers can buy in vast networks of servers and hire the best software engineers to ensure gamers aren't left out in the cold. But when many thousands hit the infrastructure at once, the results can be wildly unpredictable. We only have to look at the chaotic arrival of EA's SimCity or Blizzard's Diablo III for evidence of this. Both games pretty much collapsed at launch, with fans unable to sign in for many hours and instead jamming the forums with questions and complaints.
So this is going to be an edgy day for Rockstar. Released last month to huge critical acclaim, Grand Theft Auto V has sold in the region of 15m copies, making it the year's biggest entertainment release. And today at noon the publisher is unlocking the ambitious multiplayer mode: Grand Theft Auto Online. Providing fans with a persistent version of the game's world, Los Santos, owners of GTA V can access the new mode for free, join gangs, and get involved in heists, races, death matches and other nefarious activities, all the time earning Reputation Points which unlock new goodies such as weapons and cars.
It's essentially a massively multiplayer game, like World of Warcraft, but set in a modern city rather than a fantasy kingdom. Each time you play, you enter a server that is currently restricted to 16 players, but the entire universe is interconnected – there's even a persistent economy and a stock market that changes dynamically based on what players all over the world spend their in-game cash on.
Of course, with the launch merely hours away the big question is, will it work? Rockstar itself has already admitted that server problems are likely. In a post on its Newswire site last week, the publisher stated:
One thing we are already aware of, and are trying to alleviate as fast as we can, is the unanticipated additional pressure on the servers due to a significantly higher number of players than we were anticipating at this point – we are working around the clock to buy and add more servers, but this increased scale is only going to make the first few days even more temperamental than such things usually are.
Indeed, GTA V owners are already experiencing some of the problems of a connected gaming experience. The game's smartphone app, iFruit, which lets users train lead character Franklin's dog as well as customise cars, has been besieged by server connection delays, as has access to the in-game Snapshot app, used to take and share photos. Some fans are mystified as to why Rockstar failed to anticipate this demand – after all, GTA IV sold more than 25m copies.
But anticipation of huge gamer numbers is one thing, actually preparing for it is quite another. It's not just a case of connecting everyone to one giant online brain. "Although we use the term 'servers' in a lot of cases it's not like there's actually one machine – or a set of machines – that handle the whole job," says Ed Fear, a creative producer at UK studio Mediatonic. "You have machines that do certain functions, or contain certain zones of the world, or hold certain data, and they all operate together to give the impression that you're interfacing with one 'thing'. Unexpected load on one of these areas can unbalance the whole caboodle."
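To make Fear's point concrete, here is a minimal sketch of that kind of partitioning: separate services for separate functions, plus a deterministic mapping from players to zone servers so every service agrees where a player's world data lives. The hostnames and hashing scheme are illustrative assumptions, not Rockstar's actual architecture.

```python
# Hypothetical partitioning in the spirit of Fear's description - invented
# service names and hostnames, not Rockstar's real infrastructure.
import hashlib

# Separate machines handle separate functions...
SERVICES = {
    "login": "login-01.example.net",
    "economy": "economy-01.example.net",
    "matchmaking": "mm-01.example.net",
}

# ...and separate machines hold separate zones of the game world.
ZONE_SERVERS = ["zone-01.example.net", "zone-02.example.net", "zone-03.example.net"]

def zone_server_for(player_id: str) -> str:
    """Deterministically map a player to a zone server, so every service
    agrees on which machine holds that player's world data."""
    digest = hashlib.md5(player_id.encode()).hexdigest()
    return ZONE_SERVERS[int(digest, 16) % len(ZONE_SERVERS)]

print(zone_server_for("player_12345"))  # always the same server for this player
```

One wrinkle worth noting: a naive modulo mapping like this reshuffles almost every player whenever a server is added under load, which is exactly the sort of imbalance Fear warns can "unbalance the whole caboodle".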
Developers do have a range of methods for load-testing server networks, from running beta programmes (limited numbers of gamers testing the game before release), to designing AI bots that work as virtual testers, to loading the infrastructure with fake work to test the strain.
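As a rough illustration of the bot approach, the sketch below stands in for hundreds of scripted clients running concurrently and records worst-case session times; the actions and sleep calls are invented stand-ins for real network requests to a test server.

```python
# Toy "AI bots as virtual testers" harness: many scripted clients run at once
# and session timings are collected. Actions and latencies are invented.
import concurrent.futures
import random
import time

ACTIONS = ["login", "join_session", "buy_car", "start_race", "logout"]

def bot_session(bot_id: int) -> float:
    """One scripted client: fire off a burst of random actions and
    report how long the whole session took."""
    start = time.monotonic()
    for _ in range(20):
        action = random.choice(ACTIONS)  # in a real harness, sent to the server
        time.sleep(random.uniform(0.001, 0.01))  # stand-in for server latency
    return time.monotonic() - start

# Run 500 bots over 100 worker threads and inspect the worst case.
with concurrent.futures.ThreadPoolExecutor(max_workers=100) as pool:
    timings = list(pool.map(bot_session, range(500)))
print(f"slowest of {len(timings)} sessions: {max(timings):.2f}s")
```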
These are useful tools, but they can only ever provide a limited glimpse at what may happen at launch. "The challenge is scale," says veteran game developer Martin Hollis. "It doesn't matter how much you test with 10 players or with 10,000 players, you can't always exactly anticipate what will happen with 1,000,000 players. Of course you should do your homework. Of course you should test with 1m robots, or more exactly, with automated scripts generated by people experienced in preparing automated scripts for testing massively multiplayer games. You can simulate a lot. But you can't simulate everything, and bitter experience has shown that you can't foresee everything. There is always a surprise or two with every massively multiplayer game, doubly so if you are dealing with millions of players in a spike."
What pre-release testing also fails to account for is the sheer unpredictability of the consumer swarm. In the aftermath of EA's troubled SimCity launch earlier this year, Lucy Bradshaw, studio head at the title's developer Maxis, told Polygon, "A lot more people logged on than we expected. More people played and played in ways we never saw in the beta."
It's the last bit that is most telling. Beta testers tend to understand and work within the confines of unfinished software. They're often fans of the developer or the series, and they feel honoured to be part of the testing process. Paying consumers will react very differently to a slow server or bugs in the system – they'll re-start, they'll log out and log back in – they'll get angry. Furthermore, if they do get on, they will interact with the game in very different ways, testing its boundaries, trying weird stuff – doing things the developers didn't expect. You can't predict what 2m people are going to do.
And you also can't predict the vagaries of several million different internet connections, broadband suppliers, and computer infrastructures along the pipeline of the internet. "The real key to this is that games require all of the clients (usually 32 to 64) to be perfectly synched," says Andrew Smith of Spilt Milk Studios. "As soon as one player becomes out of step, the knock-on effects become nightmarishly difficult to contain.
"Picture a ten-pin bowling scenario: the bowler is the server or the game, and the pins are the clients or players. As the ball hits, the first pin falls, and knocks on to the next and it all works. However, if any pin freezes in place for a split second, at random, it completely ruins your ability, as a bowler, to reliably and accurately go for strikes. Now apply that to a game as complex as GTA and you can see why a simple beta is never going to be enough – there are millions of pins."
So the key message from Rockstar is, bear with us. This is effectively a mass open beta. As the website puts it: "The first couple of weeks we expect to be heavily focused on tuning the experience as it goes from internal testing to the reality of being played by tons of people in the real world so that all the usual teething problems for an online game are overcome. We hope it will all run incredibly smoothly, but please bear with us if it doesn't, and help us fix any and all problems!"
In short, if you were planning to take the afternoon off to go online with your friends and rob security trucks, you may want some backup entertainment prepared for when the servers slow, or things go wrong. Because if past experiences of major online game launches tell us anything, it's that it won't just be the flash cars inside GTA Online that will be crashing.

Are you being served? The challenge of running multiplayer games – a developer speaks

GTA 5: 'The number one problem is bottlenecks'
Greg Booker is principal engineer at Born Ready Studios, the UK developer of the space combat game Strike Suit Zero. Here, he provides an inside guide to the problems of building and preparing an online video game.
"The number one problem is bottlenecks. You can distribute many aspects of your server infrastructure, but if you have a point where the need for data coherency outweighs the desire to parallelise your systems then that is likely to become the weak point in the chain. For example, after a maintenance period, your servers are down, when the systems come back up, everyone will be wanting to login – which then really hammers that part of the system. And distributing login systems becomes harder as they all need access to the same data.
"You can distribute the data based on geographical location, but that then runs the risk of increasing costs whilst not actually matching the levels of demand. There are many ways to approach this, but the fact remains that this is something that is still relatively new to a lot of game developers and trying to do it yourself can often result in a system that works but is very susceptible to collapse under load. Building systems that scale well under load is not a new problem, but it is new for game developers and quite often the budgets in terms of time and staff expertise will be skimped on in favour of focusing on much more visible, whiz-bang features.
"Finding good beta testers is also hard – people generally want to play, not test the game. Professional QA teams will put together detailed test plans, which ensure coverage of all aspects of the game, but executing that at the level required to fully stress your back-end systems would require an unfeasibly large QA department. If you architect your systems well enough, you can build automated tests for this, but humans will introduce a degree of unpredictability that can't really be planned for (and therefore, tested). If you have test teams in Paris, London and Frankfurt, you may find that your network connectivity to say, Rome, has issues with bandwidth and latency that you didn't anticipate.
"Consumer broadband is in a better place now that it was 10 or even five years ago. ISPs are more aware of gaming and they tend to provide routers that will work with services such as Xbox Live and PSN without problems. The good ones will ensure that traffic for those systems is routed and prioritised to give their customers a good experience.
"That said, there are still many factors involved and someone using a Wi-Fi connection in a noisy (electro-magnetic) environment may see spikes of packet loss and/or latency that can have consequences for the quality of the experience they get. In the worst case, it may result in them appearing as having lost their connection to your server. They then have to reconnect which can put further stress on those login servers. A brief outage in a major backbone on the Internet is not unheard of, the effect on Netflix, iPlayer may even be masked by the buffering those systems use. You probably won't notice when using email or the web. But if you're in an online game session, it could cause a few thousand players to lose their connection to the server. They then all have to login again, and the system comes under increased pressure.
"The biggest problems come from spikes in activity and your systems not being able to adequately handle those spikes. Companies that run MMO games have gained a lot of experience in this. Games like Grand Theft Auto started out with much simpler multiplayer expectations, with the game experience being much more transient and isolated. Now, as the amount of persistent data about a player increases and they make progress from one session to the next, this leads to more data storage. The amounts are trivial compared to the storage capacity of modern cloud services, but you then need to have robust systems to store, retrieve and update that data. It needs to be fault-tolerant and above all resistant to corruption or loss. The only thing that will infuriate your player base more than not being able to play the game is losing some or all of their progress."
